[binary artifact: POSIX tar archive of var/home/core/zuul-output/ (owner core:core) containing logs/kubelet.log.gz, a gzip-compressed kubelet log; the compressed payload is not reproducible as text]
$)dQ&WsRȱngMh O!jWy2f0"x6)d<"g{Wѡ}~ U_}9DʕΕhz٣[+Z{F(ީR˴灜:p1Dgȼr<ģLҨ\zNΒ|pjC >qHJȭA뜐T")/q7 e2S\ 6U32N!H#lCEH1QPifLɠ֞(uVVj(g\ټbb]1H;(J6FR4CrHWIDpn"wӏdXMBd4'H\D0r P8_mыY6ѷ]r}vqD'j=JB_&;tfXo],MJaL]ߵhvݕ_"5:ūx:S/_e_͓0IhMb<97>e]z19@zy2e?zZqwh< fvzZYwY;]Hoovk>ԃyn`8q\恁hM;Oп6jQ:Z幉I8~m0.,]~qiVEnTǛ46.A&ŧQ5JǷiJ`h7Y )/gHRqe g2TBKP6(LyP.a*.Xq!T*.t*.t7`GZ #m ڱ>fܤ`rFxH1pZGaz#Sdlh1mCAV`Vm~@l=4o;8.Z=aBw>tw^)=<4e1Bth<,O1LdK2YS;{wI^ zfwlQ4g?[d^[Ý Wq\=c >4"WfRx*i&ZefR8|i&B]>՜]~4FH&85\nl61n9zx[A% |FX$Ly][o[9+¾Eo73`Ӎv&x([x~dN đ%,?V<9-ctJ.%%37x6yo1,F3 gW%G5r+A1]yZn\MG}7]m7Yl۾7_=ڨw>ߣ%eۚ;c)>Lw;zzhԠw-GV&A$͕+zoȠ!t{UqҺYX~j뽫QQkƻ;=zhJ^Vny'i{[{GGjdr[6f%k͚鯚_?nysotsJvP ɔeL ĺKэbi7p|Y㖘mllj٘Np "iaIe".ıGe?*D88[l0!-KR] e9WPښI6^[e## 0y>cz 8XTIT+!pv-s%g3BI@CZnԄL[ά*fن$ZByB"IeZ3*ojH\q63KҩTөlJ%5ҩtٰ Oh~8rT'Qj4m,(ŀ1RLQdt"'aeTdXࢯcYeHV zR0HI8'R"i{%30- % f"0fH~wʨPVm38OjaiaY4IEhPQځYaM >716ZGiKV G)=7 A=mб=c[Ƀ3cTaNQDu Ȥrpȁ>$%)ZeFlEB.p΢ c76w/L*&.K5壶I?jTFc%VW{˝5:TUâ!r&RH{"(!KԂm;grvuz yqwlS#[a* *q:Ƣ hM;"9I YvF;Qe(b܋4IK)-~1ǵYx=}?"ыhY)s2!EO2vJ5xP"SX,$=i}b/I8Mh˛\zcR*2!$O`R*E!bFvF 9y4WB^d&KN\ HΔTepKMȶi~B/i1gmn [s(w0,,~k>F?`@X p>jsx^,2K-_lA8ӈVEA׫Q2z^#!UX+vv ™[?9/ߞmVŒCzR,yJ̗ԁi&=&7BerP, aub79}& \ #mAy"߬kq|x{5|Q p9\gaF4&Y'6K]9qYfoŨ23Όփer@&~{Ŧ`1Y}Mxtq\YaoWv82L'?]//.d4ii&~p$o?lK1M-> U,t2Y-vb,&7j\Bv;y&LR>B`a0̓VC8ty?>aKxegA]];ᗿ|~~ky҉Z!Vg$GYˋz~J\jX>ڧ*īOFP^oVwarRFE (2tt_!ʉ.9".DaUX-0dzUE[u=7p2-\@8/.<}V,I:<w#) wB$1cvuь, 2[[D2JA/dhIGƆ41׎<*i>,KqIg˹u&@M" ^DÔ r*eGpөr3Oc[2o>!c M> V VZћ;BF @k:Y|bc\:j>LbEG ~şb4XNwdE: Hi]yn%>YE•`3F#o)/N&'H52icAV (GϬ~"E-D=QG !)e.zIinS>`(QJ b sh,,|U)F"J$-cMҙl<>?[-rvˇ4:f-Wy?s-FOR¼r|ИY|^vlQͯpI(ٍnaYO T}}̧MZ|0' @Dd#*}QKI$r0G>x-wG߆~8N(_d*>*Ӎe^YHGӃm9Y@l>L]0ny>@#LkU*5%8^//N_8ȳhqy-s9<wn3Yf9ߧ^b-ػڣw 93?ouyznXkJw_g1)C> V%w#޷.p} u%ckn;LvS.1P-ΡU^@p%~-2pSV)6דѽmŲHY,iԒ,e&8Rfs'Zl0VF4k%VG' 'D@cppI7^\KRV %.H5<&xV)Xk, IkÜuFױsh[w{%5u$"}$vd}}ʍ@  Ͷ6M[uzqL+^W\i1`Tx7!p:hD(r9?H UrN]Ep;56MUE~{=˄s/2mͫp˄:o}{x±yf2~C_k]'V eiX@R+UI_P}#sX̺LS-xzBZszǫT<4e> ]fw<l((=6X38NSl+JgX^FYD A`J-ME%1Ax'-GU::aH:+,&պsB4 eoB]&Uޣ>~0Xmn]y6ǓqǡlםXgOS仟Obj|_<\d?"p?' kTjZ6i@T$7OTL 8J+-g0?E@[8Wo|glpLZ+r@*f)e'1xkb"ƙ!PtOz=A8ͧ N(E~)eY1wΏ \BR;&6cןx^ .(MDC=RPăFz#YNTk#\H?Eoan8R?OM;߷%to^p9\5Ru.YZO=#+[)dzYR2J峅cgwyUvnFCY9[pf}{=9ӡwֶlk/=Z'y<4&-FWVzKp Pp h7ƋGxr;c(O+D"aDH=$y\ȷL/ a^.S<N:~>3DjR͇վLa~[G*lT*ee^UE\O̧e#'tUIA@"LwWNS>)e-G4b4W>H@+:E'x^CCیz( vnRx=~5N~̾fp9XIYnɬ2s=ϵVdEd ~ʎèI)q\zjTt^4Ͳpvхnv$wi݈=[;n(Y:#A#6OW\!jN߸#n+]C}PB V}~{jN-vqWqX[țoo9g$6/ЕZSv-]D^BcCES{Eӧ}I'G{I6BI5y}"`(s6`g Vؼ~Ops89Yx#KGZvC9̦Cv{t]듛8[giӬ ;5[/&Ygr&7MOQ]KoRmv÷->} ѽOSnEϓ˓#ROx70Fߦ(=Fh[Ƴ-m._|w06 >I~

| 2@W@{՚(kQG H"9RbeRzqzytY݇_oQR]<ӛ[1<#. y6us U2'Q'I Df"@8Ȭ%V0b)eɱ&#9$SDUί\:˪ָ < + 0`oWWdzwsg9v٥!KྐKRo: q[*ՃHқӧLaBx5ǒɳ^q~6ϛB|寫=Ņvz&(޹*hWPZ3 0awl((=vL38NS8ⰼč!b/k4D A`J-ME%1Ax'-GU::S;Z:>'O8x|xcj`4? xpuV`qk?w=:Oiu,+i,|W"V+R#"܏zNzZ@a̓[FR46nR%kd&Y#+$kdk5;d V5`&9喡@+4qWh'J]wʣ2mQZXYiH'%C{+ht:;?~Ub=3Nh~x RMȣW›Q5LɯH)Yw4}% "@>ټ>гpuqWkiIp&v$f6w܃n#NŇ'.:5iyBc(@GhY޽ux~&PZC:w-R?FT}dיh_$- tTj,''.R%ԖJO-%sV ٷbÛlY]4?Uqz0WXɒC FDB"X,(ͥpxfDFd kup0g:rSZ9ކlu. '@EΥ9ID))ucۋk3SRhiTtyc||C`j%N^Z9KdOb77M03DupH#H1.'! ɍljd$E&ܶ}oMN+dVг1bJw\R^J*qu~_"] BeڌG4Bd5#-5!YT=n}Ɔ"D@xXē -D\bj$#XC4(t%[eزB1k1(i,A`!3,JEu$x{-jvFl,zY[((Vm<Ȕ+XQ[BEEg:Ad hHr&Aµ7_6_dp,K^[\RE %l0 ݳ3yَJ:I`0,fEEiTF<Tx=± T6{Q#bE")W;N[5y9rjᩆJ mBf6A dWb{ϵ [] J+;o;ƨ͐jPoRP?<#w(c;qYo}B DchVgX,HX`X&!!.&ap(gC'ǀ:<ǂ. T& /fAGmM,gDP(qHs7 pKW)"ygx)Z6̿?劆4ZU(1RL]9x5{($8'-AiTT0ȫ=L[ %DgPlJDA Jc d!*DbٷGB٦#j|t){L!T8M23S>r<_;b+ylpFqr|c,HQ¹G)NƠBNȈ My;w3b1񃝼p2QӏN;c/q ﻐ FLf{DҜM/= A#/m3TrldI.stRB8$4i̐2}xꘆ'/5-t_hzRE LԴƨVehT-.ohm{:T j,Wmd#p \x^ 1v&cťJrC[%ٞ᫈;OmDvtMїua8 w<}X!oݷۧsL㋸Kҩ4"%V,XX;KK~K4 Q(G ^+N;~DДA`,ZJ q+9!B4c[$A^oBIVv)RK5/(g^pZGe?H3Id^l5i^cwqXT Y$ fZaCBBi $?y B!'ho:"6p7æ2,jBXYeflNHnC:? ƚytb94"Fv 532 R)yQGޫ]D<_5rn]곣C6r$/m.937FƐoYx?̵nq\w|a7:rr8{M;gvfgRC5 #x@+9g6BNzpߦn{OSمzYuX"vk-);9/s ( z31mʟn982ca,%NPz,6 )\Q=E>12t]ڪ7a?q >??`+E [7%6 OFɣ/țN+A N\Cal=Fkg>j6#YxĀU/:HՕ"[`q GǺ6 U+q3|/n] "o+yңZZ r7FdlLz4-dđpUR ҵ% ΠίF6e;/>Z7߼GhvWUH6R VxXyiF&HZ _b4zȕBZ)Q+2%qY{Zkh=]BB'a9ֱ g-ҭ?[xh) mcOɇ>[Z␫Ҕ0r*"ҬEH*r+۩ -[]pM &J#DA; \7!h︥j) R ^s#ѧM[ӎ5MT~r/?9a⎥]$8<y8M?:m9@hy>$uِ@6'O1q^I\(~G7ty5Ãv(u%t"2G}^͏k]5KfOi67>m_o?\.D^e'lVZkeVZkeVZkeVZkeVZkeVZkeVZkeVZkeVZkeVZkeVZkeVeω|ys0װ?a22? sH4wE%RHI %RHI %RHI %RHI %RHI %RHI %RHI %RHI %RHI %RHI %RHI %: I;R%YoI{ =2 t?s><-(܃ J=;nvxO'Gom7ӻc {ە8ꎭ ;t[PtvP,^!u~|Tp1> q+0?Ocu˖Me%9+#H"0Cc2K,{4T[;4>m+G?rG6kf."C>׌Y4MʔEbk Y<4?}98YDGrrv^=&6?Nls>q_}iBzilxiW곁oxih.+@?͊/M]N)~s%lCN~*l>/(lwx'l`i6W\as6W\as6W\as6W\as6W\as6W\as6W\as6W\as6W\as6W\asB؜![s5>>Z'J c'}"J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) $@J) @_nֱ>^ù0K7 Nݿ $g.\1<p)\O\ 8%3˧dp<;֯|833|lo99ӞI&G'49-|jcjꃆYLI%H!t!7JH]rjA55X hKYgp2snf9s_yM4RyC}9}s4@v)vp|H+|sFc__?z|{P4vQ7En.HP0BV;|/j鱛fhn@vՆzɈ%i.T<K"XUChPd?9\0z\rv_gSΔd=ޓ;}i(0Ζs8awԣ779*~=b.!s2z g60:}IS:(NKB.6V~cKX(p#ᇳ֎Rx<Ѝmܴσ#I(K2%oO>"xϮ~~trzO/MH/ /W]>5oxGS|rwϫ+-sURM.nyghE"1I+,X72உcI8e .ow뾟OO<{T¹z-9s]z~~$yh^߮$Sam)mmz2 25g,\8vAm?v8'O%C.>{|w,a@6N`$Wi+A2V 30$@^c]x\D2yht~ƋNfaU=8%\(Kc?Dg=jgNK/v{^ze3<Mzr.M>}_]l@.v"dv%涋07nTX7,n@OhJ (k|_!;3 C5CqHJd*[Q^n>(mUGuT̹]~M%V>zj|px1(l%aĞ).; umj#vƨxϠ.7;77Xb~۸`QMR+=z2xHkAz;ϖM̞#D\u|EͰP ZeS]z~&oU'x_5o:yypr_ECWT޿ׯo=.dqhv*zx2ٻ6dW~sv_ !$'A`N ar(z~g")qղ-q=_WUWWqB.d^ ܇2|>)}E))\2`)"'W<]ԝ vz{R-/OTJ`MP&}\:$U.lz] 9qgivϒ 2x2~|aYSmqej[鍼YҫqG`2aiu4W  n`k'.`FF|_lm]4͓k ~]T5O~|wh?x`Fb4&8_43Kzf7#l>zYWz{afwS; 7 ZU;{b|wO]ݐnfSTa6 f0bjur4애kXabh=*EXz% < N .(kzH=D ɬy\_~o?}û3L={wf` I' ,,sA׎wӦ_umaMݶPUJ^n濟Tn߮Σj7|I\_||8ٻlnN*E]"X YXMUǻYE+}:~/e8.} !Ot K%Q0낀_u'HEmVvm$A"`И>et*<PµtkA)6CLaGs;V2&+@x+h+ |pӾ6SM@u2!$.F'B4_a3,g: {&_I!';Gso)XgZ5|Fz) %W׵+9?_ddJ\b ȢJ/mPvt)4 ?[HvA w1qUn j1-U1Y``Y-²MQ&4a';r8zc_`UU71τ֩ ==9Ҏؑ$e<]%gV'L\j:&~bEa/fh4lz+o>*Y*Ie\8`X*Z sH[]FR3bePQ wRZ7Eq=h+^o UCz3[dj IUpJ. 
cF@SHcĜE,!,8Z;-ʧƿm1wh/B^zSHhQqsf%`݇ڟ|RNb ]y7>7u?݆CaL\TL[6nװƒU˃|ݖc1t~?>5$z\SHoQy"mm Ï闱L \69t8kT5]:cLɢ5 &:J R3p/sa_!>B,tXdB3 d5M`J1B;pe3\T2.hdI*&dN$C0!B*Caj3jX$"﵌4Jlp45[tf>NϘt\ӏ+fmu7L%fq{/[K6Vom7=yo:4ֵ5tlnnwW˚fg xҗɽUs!*I~҉UŤuRAZO˦~0wmIw6kвe+׍'{Ph_z Sc"rƉKSoԘHD_xەh|#fZT'6qJłh_"Xɴ̖P1 *=z_èathFʆa<FBNy((p4K&YR&%,K'ɣ1xXyM t1us͓"DfB;ΰg(Xp(!;SM麟7+{_!IqDQ3' q(7S{+%R.\H(n!\&'^Jr\l #%POF.Y(ҽg$H UȄH&3fEFd>ZRP+QLzSNq飶ăFbduy5sf"TndȘOWɆ$P,#n>/fQy󼬺\{n7 J3oj#tyqPBKG@!$i5,hrL[D1M !{D%$hpY$R41хT;;#9&CAm=xdG$džA1!4#pn:bcWɸk<0rU3 4Cv4@УH PCJNJ'R`E.6SćCAlq "R)qDĽZ$yƽԚ9 2/D ׂa&1GR  6QKFbS T"(&L̥1;#EWmU|٤P\\`G\_ #bVaEs؆Uȅ󄂑3amc!XG\| \ f5^2F9c**{_ePMGhuՃO\ `o0X*7jl`f6}\I8_o7ๆ5FXk])ߋ|40wpdkeFO$GL_>xؠ)JFdTmؙYJSg9hz_Yș~x008:K}Sau"Lhm^T&WW[;vcmWzrR^il^Y\>WHύ.%Ǣd$n|T2A%v0L8J2},pdpTzoT\ GW].[P\WK6jW$\i}m Nu[R߱o+XÚWNśu%md9_볙HUbveTTF\"d-Upr޺@)>!TfV9๩.w9"*6N߼DT/G1֌@PTQ{< 4\ äVDC0p pWf}Jkџ8].z~=t}&fLg1&^'HxT/[3$OᝢCJaDGز'{#O ayF,B0#BXYL^jʈhA #(ЀG!eLD:s?O:A%(e o;3ǎ>W'oL[I( {NIJ%xydxi͈22ڨJ&*'dQq˽0w_F1"þQАo3K/κo:"xд@R9M"(&[K`i" )1+CQ{PH+6^ԖivD> NJ^=f/{oꫀ;gZ 񂉟 ] u}4mv׉>_a;$p٧{|“CD5"ԂP\6% 1ɀsхGN!(d>" BrV!&pQ0Ed(e1-dgZWredNBZP$EAQUtWmV!ҕ) Է(UspFCo˯ 始k`xznX| }aY'`Ǝ0kzx**#Iui"*Jy)%uhW죴8W+J8. sH@tFkI 2-BAq6LˋEq`W:`bDGA 3{Jvr!w˟mlI}><Dž8l8Z$`Clt~t[, ݚ#[e=_QH*F9X,"ZDm/jMTt.o֠/.J)3Ξ;s'E5˞pD^ՓYU2lb[ޣͥpMPɠNBj89&XOgg:OλC"$u37P'v- l(0Ƣ#ٰgkuڋtbdI)O6QD1 X?RLH,yORO#]bW8vK>4k)bjrȪ_u5$tBE A-\7wpS}ng֐Ar.\S ZE,bbIp[c{܀/w zW\+|RYL&UPvsHԓ넄9䙒M5AAm\Nds3W08emUv+cאnr_׬05Ukmvxwb|a 'i=J!xņzKw>пO (5T+ 'th3gHH\ JdlhWD*뜨#oJ^ݻ"(Z !4BOw"1þE~k5u'/q6m{ O ]1Cg/H8# ݇BhI. Gi, BcVmAV'%wP~kܥ Ch !OniMI, 䩄+ $4}֋߂sI/a$B }J!ňF plŗ\tk6@S Y1V9cߟwO{ײo?pӝ? v{~fL{>}^ǧL/pk,LLDB )TipG7>~ÔEؕXcf)L9H-"zɦ8R"4pA!Cm/lĺfjͩYE4dO96Ml~5/ lI=3ti&O|y;͖_Wiݛo\M{rg:Aڪ4]geOԁ{qFu]vP[Oz}i{g=޴s\w-'m9]Y[_K󭖻ay~Owo|+C:Wt\I>}/xU8pcח3[/y)w+;#vЙm6sgZȏͩn(#-fyf5#] ]&PDe%yqSv{8D鸥$-Y)(3V>d#!K$s1jXַff>o Gϑ y.`ǟ8!& } PPpKPn.HzD9Ta ɱʹz}lU[E6PKJy Ke1 F%# ;eqf?Jj ʇ( TNl hiD &78$GA]2242FLVh+y4#cH"*l uh'kP 5:28 ,㡱0ϱE;9;2"_.VoDŇ+؉y 0ԦQĐZLQ_guV0bFlކN:Kq#5\SSngPAdK3pQYE1%jcy3eEJQJ ͚,yJAgcP["K"F5fXެ-(w"C ʢً8ʉC 5FTj }QYhaפs#7c#7sD#.wF ZִEe5kl(YWH5bWBC#bFkLLXx[TXL@F 2iIg#쏈_ދ>qqutLkìqqRs\j#NEc 8)+ Jy5H.MHqu%c!l`zUig~pkn$ӻ;;3f?1;A(mB:TC \}_>S|y 䭶m 4Fg*-Er[sv.˲^*]HɒuTA83C!$,U\@b)ۦÆs`M4=|tc#}IjCm8G {|KjikV;;̫S-[7&^|^mm k>Vv)K)Ɛ%Z h!pdmRRGC5Mbp?kR}'g[/rg8t7oYHhM Ԣ/QKFBK >4SSMMg(JfxO)h r)Ɖ8(S$3[7SfGHZ\/lp`S8I.Zo&tնdSmgRllYXt!K]OV_oO[.Oofae]­/㽍T[ӭk{IUVzr'm.v&|y+~.uu ƿI{R>XbN6eD䬮wwV}g{7 %㬱޸7ͻj%+,0{D6@:;Nziפ,UJ$n>.ٖbuñDώ.)PAll[  qz kB:DCR¤Gp%4НTF$O|egX O~ hJ1H=ji_CGGW`2qQy6I.rU1&e&gɣdrrF&?hz0KLk2Q2{ClbvZ97߆y Of-KUˮGyt1;>3iG!+ڒbH}1R:Tރ ?A VLx. ]^9<N﫜FtJ֕ZSWka!!`ĥE$:xP )d*`L:lp|/~|oӿ_݇N^}:9?/S`j!!O*)>lYtP*?]|{7%7ŦWeO+"e0iCrS@c|vSYdE]T@+,*A_Fb> 6)*הכR“WǴQ.y^]6^"(35nmځ=~Vk$k$IPaȅGL+#s<ȒQyT9 1qHmlXKVIm4@;\+myQÍNJ(O,lh+$.F'B4_a=Ft0Ł1:0!Mt>FJ'?;Ԫmiv̶b,&Û_d\j^.t]/  Ee!٠Lf,b_^G:.ųw͚u=f8/=JJ$[Dm&,hiIً*hY'$i,rvqA#X+ s=_ϛvmؕ[ 2_;9iGJj;T+0L ʮM( ~7@]ܬOOW[V''[ZC)PJnAW]kN]`yg*e+tJvJ(0fTt\BWkW %!=]=Ct+=u|(gssxVWތ#l7Βzhtad>#$2@|yj3bi) K'3JeB+EYPg ib: U>(MmmLaw| 5fpϮyCstʥC̋x>˹`1X6GETkⶨ[+Z8IeeC*OA:(Ҽ1|,|k>ڿShn ~} %,@2Fϴ,sJtaZ|M =K`RGQJmƌ6>fimw1;qʎXUXΨ JjvPRo};ۡ-]-JtkcBXU[{}mJ(!0!J ]\IW*tP23+JCtՈvZzJ()9ҕtEYiӁ)xg4!#,8L)!t퉩%{$W}WpuyLqnc\g8BިFޣ mc`bҙVgX(ŪK6L茖R-#ZFB)zs28匉UXg rĺBW -mmG=]= ] XJ:CW . 
'}>UBkWϒ$Â;UBKe*dvJ1NHXtGJp ]%;d4StԞKJpUgLV]2$]َ2w?x͛c^ڻ,_se#g+UbG&^ ?Խx0Uj:a,V+'!3 >; %_R٪z| 8_VV[/+,:+Ň7 )L\6І"{>MmLD|)FiԈS\)H4(D`Bh$ ˱V΍,fcK+E1&]PҠmPVZ*I`ci9I1g1J(.MKz0 e0,0Wi*u}HDMC[Mwg&ŕX/JT8Ŏ(m'az;iq{dD0[jzmLr:g$7"z-) kk1(O( R8J}c."RШdl$FX1 -afDHH!UR aѴa9k ΆrV߁U8oǐMD@kL *YD5F))rU]2L`0/;捀'Ę^/<<9¤'H !Q Ay  r0)cIBXcaիZZn:I:i)3YQNL59"]!0^3K5=+ݖ/)yI@ XY$1|3L2G#QxOڨvisHxc"|KӇkyoMQ|< ܔ*z x#5!m-d@*hI((` 2PSdkI,"bח :l56Dv>Ұm +GuS5AKAi 8)dT% 9ce_E:!JmX:s4^ibufthAJ͓,5v՚h*ѶZv՛m/0pF%Md4xf4_]L& n 5س4,BHaЇ|:^,~Iy_y:\Y ~udt6d= HKF!N>[Feh:*Ws&xz-Ghtx j62L_h5jOmw3[zDhYf2>`yDÿl±\psTnr"e%hљ o]$Wu?y+h^ WwoЌuCs%ػ{L4: .J.h;fE O}VjA6層Fz&[cCzo@Hm_jRb1VZkqݶzzͰ˴< ps?D]q1Քb ,)KW u*`pmhݛFͯטQ?0Z?__٤oi'n`rHփMaeqnQs:>F[oa]L(.ya]^k:me'l{YN%VZ4i4!ibcuހJiqA"T*hZ KJ0GhD[w$QHhH~*@z8! r\FRaOrCNTB)7j<[֑a$cZ)"VB"INY ]c9=%˚Q}BrF>jV;{Ho/F ziV>zTL%4nޘw~1LJ_hUKnJ'3gX&SLG}pYjz򙪗W/[^"Q`PlT OQ'#LI̓בKAÒ`$76]e&#QHYu2LwZ D佖 MV#} Y^VЌۤX|::}h&Kz2dZ^;7p?Ws벜,]Y噔4&,yxZY0 Z.|YIY\?`Iǫ]Td:USl9rM4OۚU2|n}{+3{lJä(o~7:o2{uMބ"_5xܴx/zߝ-ɮywgc>ob}._!-WWniո67~d4j9"76:;m`Ɏłh!XƴfP1 */hɮڽdmKvڳd KKv,DZAilpGrH嬼}my'AFNQ 1@IL A1 .zx(3 %ZIt% POp).}ԖxШT ./cLjUFƮ\{.^͙^$͸b e馣g 23_Gr3tyq PBKG`!$mFiLO<:Ni''֣X"Y<ݑ"%ESGϓX`_X쪤QK &ljIQw+f2dcJs,6Q4xɝAQTBێhTc"^~Fðdb:Ej^j&wQ[ND *hߔKgU! _{;Iq$ YAiGH8օ$Pk"8ʨgȂP$bgDd[H/{}"PIFh˂$QΨP!MAuIVa'~_Y#Q՞faJ.\*5T*cnug9[|[ݦذ[H0HQ!j}V kTx%Mq1 j*#PJC!F(gNϝV;D!R'H WZ -*/q ZVz|\i#S",^F8Xf.0` ΐN7wb|A/9ֶ]M^b2xdJ ?sAX8lRSZ>ym?,hwK@)jCOY)q Hf8] \ ĴIz=X[Vi5)w)& cVPsH1ĜX{{Fꖑvbh.8rPFG|8̱DCWA}3ES,iq,R!v-nJc'_WgX _e7މzLWV2ta>I*-f~\V䤽g_rNIL9ol!Hq.(+?Py{-dqӝiLs4%C(`;%t/t"(y7[K-GJ8@Ll̗(D\p 9r4P2?l0^UJT-5Urxv1>^~TJ xv>=4ߙ#b(a$Q29Q:+m2uQ7Ϯ5g~zybY1[\t>xwV,37+)Ꮧ*33;\,y>焴UH̺.چŇL*oC*+W1|7c.M3uTJ6:dۨ-s]N=9$,|M|B1JEVpŨvkrЩDl0_~0>Cr/?9}W/7ߜ_^~)P}//N߼w? ̪"iG Y;Cŏ-C+ kBU?^0|;Fv=?#Ld- ːa!Gڻk ]-]-|A[Bu )q_"T"Dӷb#BuP ՀsҗjU]l$ث $#ZN|L1BT ܋H(ςI[s5@mv6̅KYq^; \ ep)%o%e&+QhDYD >%/c_N'g:8JSg4kIr$9)cדOI85jU4a;h.U:|?Uf ʊy*Qs"J-DcO>y*ҥБBvAQ<Y(5`"깸@XT)+lpT(\FP,W0q%m}{bpדn 17'}lq6i$Wx̘I.md\ʯus쵣'Ҏmw$Qűw^ƹTxdկ/k֓ i'*qkkbi K%7k(LjO_ðjs;&({/` 6(8RN~J1vV_fM?{O^˄:/F~=BKO9$> \.q}ۇA+~}=UAnLd0,RP]!+,=;nxwd̫[:@wL骣m}V/bY<#w [1A阍 I~T}:*O7?U׊5[Էd8x.rގ>)btT 5ףT> raz)R{l8w74nvM.L.]*)qĔN8S&ԩ{o~}bݏn|/uZ\ fgцNys`S Ni{vQ+atL`*bE<ːĂdeV,֣KL&Xܬ_bqy6|3!WgqX̰]ցPXA}6׸u~8D.d~q}D9`ۇ\lmZ/żNG {7+8ı~e\yMޗEdȂ@jR稢Y哊-v≎,6wG,()(tiO⣛jeqR!N%]ri5 vJP$OYZB IևybPN M sYmHPI p@e 6̓d@D+_bcetJZ 0X  #D1qI҆UAzN9 NQ\vF3dr8kxBPg3iHCh/iA%,vt>siZ@4EM`:#:PW}"1=KBQpFmDsos՛I;XWTym?k=ΠDw#/?}Wyyb4,'ott^{/z%WJ.!rb5*Z>b?OṓKoOu@OՑaE.BG'c@d;Ƹ2qD%t(QOw[rX39U\$C35DXPiI-jk@ՐhTc8[jgQ1~> rQ|ewdsB[>rvVG?EM>^04E )A0f* ƅ85oܴ-J* t{>`@$ cFƭFxI.KSV*m"S}]tV x2RxmY[XkMQWj+zi,K ҹe.Sh|姴 ;\2iӥ5JB̳/;Rv}t\p8o@#to7B̷Wć@+ "XQr>r*%O,({yvs)}J?v`8mE/{mT:ɫG 1I[#y^Y-x7\ TBGp ُf Na/!rCƽ` S$YDB)*& & DnkBDr2(Ũ@dT K#x d9XO; E>G!Iu@9^[Qa?ۮ>)vr1l۲`äZ泒*7ÃД:`}$)<_E.|Fu8'guU\U-đ$@.A\Z<& x[lspTe+ /`8C8V6A4M צ}m|s~7ī@fnB(1`1p[A[CmbXKwU=5:bt6-'[u[jK]Ƚ$޲ m$ֹdF֭+Zd2moj-bKYi[KOߍ7grR$]:k8=]>ލp̀UM;ݺW DcǦYz.Ǜ7mfFq/y_# `|7. 5~\SBR gPP9HlKawW><; C'.(MԜ)&`yiŀ>1O!^bؼYQs-9˝R 3.RAdna lI!fH ^ wE%| *{(fa1 n ۹R51Z/'\ #A*Γ ܢ,4sۆ%=rFg `hn)1H)hPj(EjTײF56YI'D"VP*J"1&O1;C Ց*ъa2쐏^GMC1e3KY5mCFWpΝ0m4AhN7L9ygD4x7J IXuJ3![pvMq`pVg >ܩ̡9Mg=7Sv;KyVBˤH@55.>+KJ _4H?G)0)np1^oD0'b<{5 /?ڌZAKԹRt0sQP&ʊpoB9w>&W 7$ PgNhJniB[NtZX(̲ Uŷ_6l3 \]w& 5y];aT2tCVI#" ޽Yvwrn̮W)I<*GBQʶfV^l< s-a͉R4g #¿e`CCCVC$lSOIs=1K2b'6#L i7%JqTQ( Ȣ`;X+\$JKu3ǵFzM.K{kcoҁW{;}mFp:\B=&5BM);bmətk mY0 .Xb+ p ;[+zwڹwz(2$Zt8Ab 2R@ 1;- \oU@-k@PF+1waئtF[ :aFhN[J/? 
g>9E\ >e"|,Ѓ _C6r/yS3A>^j)QaFtI'Gg?r>g~h)*y1vcLT}1C6ߗAUB;Fǣp鯹P(_ru}bRlY>FW 8ygZ1k1J5^ȣ ) x3-mAw"feڙzӎ:YR2Xei-/loEK7|/GbL]Z"W}KK@gP=bW`] }aWښᎲD%'v ٕXiG WbW\7JH:JT9+4Jr z~*_mڜm"._Gp:+U}Wz4xH CS4iGHP\42FDUII>9lmD.a®Tv]%*gɮb(G*+tbPIbɰ++-=CHQz?1!ˮI#G'B? 'JbhuvU{`W,=Fs$Hp$իtzqd.5p=K=+XT?W>l@ h^ a=m GUhf4F+U* D&iz|$009Ky7 wHS4͝C8g\1s)=3 kf:"}fh=Ր~ax'V(4/,:Ept!BeZ)#u+¥&SJ;f2Rʃp-1Kȳ[#gŢSnC񈬟&]%]B=aK6Z>y]AOn:' 2` δUK c ' 6TS'lĎ:jPcF́(k{>!'if HFǤ"9̰4N+  (O'@2qw P)2?C #':*?bAh^a9A-`-]|kth;HĨVe{?~pɚ3 1K_ K8)5UHp>d7JΥ,5"; TF"r|ϥSwlFpa{ǣ1yl(*:m1 !brA)&ΪB\eh0]_Yuvi 6Ry1jC,UQ.LF ?@N|g3UvlnDYzJ&_'o/Nl1#5%|c{Yg3[3dAqQO/BM=1{h4w#uc7ΰAR߻Lhg;^62`a-u2Ȧ^ RXG)X1P[1KlP%wۍ2є}X_>\#lo7ӿo_û7g/_=D7/A|30q# bc~݈ӟ^M.4bU)te߿\ms}:LKa)[ٌ B#fDI4hښɄPVX ZT`~[rC/e5>.|BP\MH3\(.BVC]ѝku$媞$(R0BĬ`/fc µtkA)=CLa:C&%/N]4=@SPƧk0A>JcIԁbStZ۷} 9;Zʀ5W/p Mnvv>puܘaƠSf90CK3|FX̓tkn~se ]8CၱgQ9c6([ʞARL2B;I%\p^z J$[Dm4ʲ$*N C1F: lp6HF'gAD%4b (Xe[IBE%FnXH.U/n B+*)W6:11g(/}sm -$!*1tny$XLjAP( 7Q?CS1*7\ UmN{ЬO=?V_p`$*8ʝ7\ Ø81gz4‚SIP2gT.L=o뻹Z# {6dɣ:> &IA&Phl">MCHҗk?o$nFw8Œtu3gρƦaƒɛkXLvcB}Kq j 3=瞷R-lfQn*w@jo z!xew0-tq1W(/xlWq'{f)gv[~ }U([U?ť*|yCV[vY`A trio}&ԗwSH7~n]oh.{ҷY@#R3,Z`b>,5hK5ŎX;*v(UDed5G`J1B;pkȥ Ӡ%#XѪE]`B<UX-zg՘IDk1ghnDlQo=jJ޼KuŚtׅg]OX[P׀/3^(5d:| $JA_"OWꕘJ r\ik)LRFuJ_U,o:I bRP.Ɵ˷7;vl ›9WufMﭖg/xTgko.`I{˭ 1,K;w;l,Q/qn'f-읇-یky VG2t)Q/,9\MNDP1QȮdLTjrs2jAjncJ4Wr~F؆b }b9Ӓ3KC4H¸% m #falw,kҾFjcS95 \:"MR @wHrw08`9lw` mg4#,_ղe[eeOl6a=zq6e_iX¸(Bu)ˡ BqpRACTޒ2zZys;K3.b)FO =IʍPZ :a6v5O&O5h6B`buc`VR#0N9Y~^ [m (S4$XԚ˘Df@$ȨQX}Z ѧP59+ڒ$`* io0.HL*LA JxVBeEp8gӁV/fP>91(HH*5 Lc Q@Y*+_xM*H/)$)F9ީ(K1dWܨw^&ZǃFd i GWYޮ$cӱxr"c)x8'-N01ig1h enr;U_e0eLf!9(rq<`I1TV*BzRl/c$c%9])x떵/:7VyVh[WEzVzs3#Я )C]ޙ=L\"d{4`m%IkLh]|ۗDTT "&wQN,-?RFFS9ʺ\ R(*K̂hjBE2i;ͽ H5dT>*[L gKs<)FH9"}sO>v_)hB)bjtf)SQ>_JCqSa#(A(EћT; HiLrrP|ʤxaQȒEAEo!d-6sEXI[Zù_WY@Kż^]{g_SNY>T:-ƓO;4h٫E7ٮnC,)??~ϋ/ե7m=jcpN( .5GmI]6M@Wdk???(>߶Yl 67z"hz={=L?`¾-6suŢ^Z}Swy]G+g0mn%۪V$nvvrr_%f-uk-G`~k[:ךok惲e~ x\<`\lvz|q`Aw2 Ş[ǻjů6M?шDHo'ϳ2bt6>A 3%tC(kutS>oߓ.p2ជ^WQv=y:_&vغPF ݰkyeA԰h]-6-v|rFYYsDH)!21*2*K#Z|vJZq[^Ч)^P G_vgt"O2s(HFuZf ŨrJ3M =Uc>i88?4ƮtyHPQ/KW?~b~5Wqȥg56ԸTcK`͚{|-XgtQٸv$ a[N&kLo^{z9+96nFnu2ķwބu:@;駝F6Dϯ րơ9:>Z:&xrVbz#9yF`:^-i{"]EGvTZ*其mA:8~3~LC="UYY$g%LC3|F ~7 ZJ, Jsf=|(>L~3hD͏8|?,Mq%m:|zb,f'mWO1oDJiI@F&\XL"e=ДF Fnlҫ>NJ&a;dCfaP]9ܛbiywb}dH1 f9Prmi&- Dž)3jW6K& [f0k跷W?7?ޏƗ52U%#%=7taf)Of+ՂZ2_ZbmV[iO[:!ڕ Z7>aHN/h!bi~eoBސ6' [J H4M0z}qCЩ뷏oRrGeOUS{ /X{g9Rfo.R=?`rF8)Y;z<3_\zkVO^* PRbIR=abQytt&Dd@cN~ ;Ӵf5gM΁%7s~ۓ"Y#K_e!9K"98j`μ.wGɻ?"Th&V6s?u1 ql]؟7J K3vd Y80v]vùz򙣷sztǎO(_vk$kzRo;k'jZ0ӵks癣stAߵphqŋZ\d4[̷2ހ.xrf@qi|1j!˓kW,){3\ա;;n!8N9 Fd6ZHB|Ց@r7<ßßßVcsC4 nb4 1%fTG%kh,xfDJB(IJRd\ZJ9C0Mt3qՆs?Bc ߵr&sOnMZ LyRQTU}ZPlaOɰ\%4^79FGL:J%/dT; JƬA )DP]"MG:Ƚg8$q KQQ9L"벧iUVy[B]Ngw!I/U YNqFP#reUpI+}8_hBKs3yTrҾP\7%x0s~aCn*0se%WI_-Hg96FdJHl,9ZOs"7}F?"ȊϤɊܪJ6E j:!~@_<_^1a^F+:UuN;hUEX,/|Aت'9If~DW~YcfSXJ'Jco5RxKt9Hnö3yp{q@{Qr_aBvLpZZem`1(KafI Yu^JyΜ&gzH_!K #&`|lK I /2aԊʯ?Ë(Y,5R-q}>L֘r(&=E)10qHw|Ysv{e I/m}fj!E~VWih[*Y>~Lsl/vSn*h/v^M!c:Xu}JI WR  Rd!J%Ш;o㢭ۙ.>iD&R-VŬ-R TA`u٘scvgd4B 5[0؆,7',R( e."uz';ֳΚzWfK-ZAQ<C4h{,hFV^%M']oE  ?PU]\88I+77AQm(I"˩IdÀs49"e${!ȁKȵ̀ud'Ag2q 'r,{ M m$AhYtFZ$?P\FnsK.0!Cu#Bm¦DO|ȡ':V奿=ԃ%˲`$倄Gb2ʑӊ448FRiXtVˬzt+g˛hGgrvȎCMi1яNܡD4++z#`YnLJiLH&/lb,=9ΚoɉNQ~Y#ch;8rS}NQzu<(Ho/A`'9v\r㯣Ցqiekk^wwt#ش:>?_o9{U-o4&K`o0y"_Y vh{D``duǂYZ+d12K+o#p#o_tUFPK{Ae4P_(廱$Y>F9FMݣ&Q>WP@fQ&k}hSW֞Eec F7;'E="CdL7C;Mbrį_*SOjI -]N0'-l8jCF0e+0v&pB+O>dO`МrQ3w@\sùcj-r}*u.B'.8w4~^^o ox/lHy+E??*#Xyr&,D0h!\s:!||Of$Zs 2`} [ty]UÀ:J C`*u {4 AYl#ZD*rAnN`H&2\6u!$G.=aG~ܙs0{_$/PK_y+/}奯W^K_y_A|RiQkTS+s%zDϕ= GD5-=WJ\+s%zDϕ蹷VJJ\J\+sͰ֕=WJ\+k̾Zt 1fvu9$X,P4 YjQDkW޹,8]NdӎwT 
hUd2Yё!n0եz!qce3~T3t˪tΠTO)<-i>"Xjeh`Ks;XZ=i +ֿ&RUX _a+ֿFĝEĪ^͊^q+.zگYW\^qя#ӧb R3?ytvF\ȍ.`l<+DwnXhXvs휱djC{fvvL 60!kK̒Bѡ#8xg]I:J %cFd9clpn&gRd/#D9p.,S]_>rxFg;C^m>迸lmG= NEgDH@ e:G"i0dj[w;"Զ+l M/י|c[,(]1)$<QVt6z]HA`Y-ѝz&N.oIȡ!;7AŰF?: +MJxw3َ֑( (˄lB&h:O6Y&Y,:kG (9鬓M7Bn,@j%HCM@}'$&e$Ϝ~92Cƴ׊6Sȓ#/LE E R^.'V(cGB$icdo R* *8c JjΞ մO[dOO7L˝5ߗ:)K HNF29HR;osvm)d>X!3K[J1SF0FXa l(@Eᆡ 4=)jл3qW[*O89{/e}k\ z;!YΜRm3^kk,THЊfTތc5Me:WZL$73&Wy1&I SsLԭ"tD )yLjQ yߌL)T枥R{m"饏dz_p6jc*g@xOHߚn; E>l(*ggc[YAc~(WW fx}D{A||{egZJn)([2Lp%'h)'~:ŋ09h 8kϖ[U.p)#s:(psUs 0~j?4sq5icO~oA-kס[6GS 'I%pYw9$_u &;^}uJ;=Z=^v'?]>/>x= mc.8#+Ȯm Ɠ/׳~6fcMbS5bs5mf]T>'OFbf9:ٶͷU.; }%춯rR*V$>\R4- Iꔟd~?}_(-j qxۇO?ʿ}_w?}ą;8ъdcG5@=f /Ps`Uua ,*/ލӏi,Z7v.s*^,~Ȕ,B-'ٻ&dEEl ] r@=t: 1اN?߯W:/RX@LWfw=x݀MG?Ï/G; H6HIc$x%CI&٥ɔAPdHN3Q&y1so6,6Y.=O{)))wŮq&)AeLcQc_qN'g:J{L&^I!';Gs%YLڎ1L'Z4ʆ֮(u>낮Tb7ASgEE|͖XK9(萊u$. :K.ĕ'w@JoIr٧`9j# KFW;-N'zj\zw>oF/IU~fdd*}ȝZ:xFX7#2Q9OcU(fk}ށ23y\%5AdD A[WEF`=i7]:$krT1yeLGYr4Nd,8ƵYr'?p }S^^fWlM';=)sY7wt1lz/) PK=\kZXPK+9T7Ȁq j_3}53x[o Xq٤YoTW޳ŗ(Bt:ie4: )L.+P$z<A G* l!\}d,%g2{ZpN]@:;>9{,-HWz{<Ϯn YwaA*W7+c9hdK|W&}NtMiörl0;^X\27B%˾ڴ$~6Y%n^t)P3o%$|JEo7vv3{{4BWƤ:mfCo6 iݿm]ɷ/.|Qڛj2R>#m{qϿ/ܛ]^oI0^WMd~9;y_;ޕ{zoȭn%V ıOяZ1߻6`SbkRøQMp\ՐBv]ʾԖCvH8cp޻hR9Da<&vBhM)TNkuȾaQ$/?Y{S1hh4 x-i-JX)QwY(G^x縹U 4fETJYSB/"Hs#wmIr!KuwKr}8\KL*^,_)(),s35=UO?]/6Zg o}(R\w:\fzc.*+iK,[6{_*oӁ 'džH ) N8}E[GJQPZ]cSdJ%FQg-2WR1P„̨`֕:Hi LWi4cW,~RXxm72 \}EЅ#K̼  (uraָB˿{6A$QhҾElCuBVBǨА,D )]aN &ޖDE$8#vy(L;K#EK$kkȠ3Lt m~EȲ:0i a1E4N %4MU&Q&đj}Cb.."ٺR3qީئǙ*0 "6ӏQŀ"YIR}Lk7R"e *I-H%W RBR1p)"zBX-}pPv Dc0Qaw яYF7t y֠d v`vh렠&'cEZ6k39}ArtњZ1yU *cTlr^ J+T1Kmy$Hv:m!v>yYa(Lc%*=OD(\T:ݐc CFr:Ib v*rb]@bP2DotxJ)w1a JB%&/3eDf#j>`l"[bDce Lp)({ٶY';oNd ۺ+~1hX|@A`d萎xZrgrӿ`90&L:{R ;I 2D熬=eEؾ_,dI!׍F)wh~JQܣOP5XNMҨ"a(W ͓vU˶^ xK^2f iUL |$cJR$QĹ߂`ǵY^dk(uW|zbpٯ{)m JĦQYf%jEmʰqjeMPlwQ21@9iUȐTc`h&J@w@yb~}3fh rSV3o[Y`q;j|yݲu >P5"ێqTHl5D0TvR< LgGy3 KC2P-n<lSeSr).:PֲZy O!T{)ό#Tl@%墤$pO!F%J^J7 BNjYAzZlmG P}B=.;m/d#ī/N} >ha\*5F3,LHΉq&e]$;s9v5Q`|ߩ{tܾټ/Ge:y?-Nk*1Ϻ2u l2PSFhon%є ԯh)|69=DWF|! ^*(Fnu,v<uF~f^^<f{(a0`?t-ŊXcWzZt!ɪ2Kn,.ۛK].ut*a6K-/&x] ضƶ,bKxZ} |P:X%k)<}p|v9ĐuRFRF;1\.on >ގpuS4.~1ifGLN%RhId7_yʄ;͞}3czwg.jh=ݸutwU'YEeZ\un煭K}>q aN bEcXuf1r˙oual*=ɜ3),؅7IHa` @ʫ04Y@XyBlsA>T*CQ32;Sp R2xcep"؜5ftv~z(eƮ ly(АvFLΧWN'т˫; hnYļƩoJ$sZpgWWlko^1ILR $J`Qh_ FDg{#u-驩dczbIdžxR00$U$rhmdI׍8[YoW3[ݡ5@xD(# :k"E_): 2򲳙` a0&pR[3?k 'X/ħt)WZ|MA{"Q C:N&˗1q-tD]:κ߳WMQ"ON3c xKW#CI >l!bQzp{t]Zuײ(sI vI2PkJ1 H"3|+0aٛ(]&"ƍŌnrzD+7o'H:yFG4yu2!SϒGAI}ӫߍ' ʾ6j:u26=۷|`yN+mIo''$WTJVm Rk&ڞWmlLL4t͙V3/3'Ӧw搛={=!9"be "䏅t\0i6#%ڠ6!i(bLty_@BDQzDLEr ]Z.7EDFBRG-;/ͺݗAn"ܽj=g@1b:4Oݲڢ|%D]iXf|4[ h oJJuShI6^VBu[Wi|k(tuBۓ݋xy9kEvjcxv +ku.^EeTkބP"e:} [z$5hn y0'y.Fi1uo2Np_=DbxDPVM{kZ7U`n`hjǿSf47UsFzq/Zww+χ/c֛0Lѷ@oFoYiF!wmmFMƵy8VeULarҐHEn!+I2$QQٺ 5.݋bG/Jѝ:|v3V8_%ke& I8.թ]4V/ZzkK#Uo1jiMk1q]uO{^':f~F57p@+FHoFf?Ft\3vZY^m_XNV>z>jίo(PbFѫ.%6tuzLI{`@7LVYF $P`;kD!KvdA-sF-Ai@RS}1`!Iz(̀WY!JK.to6pcQcѹ(&h, (A2qߥ5k/tCfnAW+Eٜ$3$P:\q:5AGWS2P`C L{!cPG4YRL=_vQV)A07&ʑ9 ֢"X嵰El)m ?i7b驦bHY@3RBEhT0\1($ FPN)``Hm A1VFK 5C䤅uP$o<v!4f,r|/4su& #NVAgfkV eJGt6ׯ]_zaBi~K]Q;R9:*ԁ>K*(~swʼnv!xj]-$PB#glF`6g<=-[D+(h֫|~Rb m4. 
^\L݋/x }39z_u&S._!Sф>k,0җ'e}=~}8?=;Rg"dvv) j15wˎӿ?7ar9'4N %6N75#77f[LiOϷunU)V7g--:ڰa<ժF2`u0.v9J5lԲNln.ohϷUNӼG6Ӆ~ߤ0 _Ѐfyoz[\|$xT 7#Y%P]32c9 yt2$$ܻ75PVಇW&hs,csq!B*cCUFaMTJT+0ii_9&[W!;;/>\ҭNËkhpdY, WA_J*5/%k#dgmd) !k7*hV:wC|9Q\0T3T۝t0tjѐS ;_M^;i~q҃'v˙g^ Ț*!GNz" (U5*E;hMݺG ~%lH{@fos,ow"0'xۓiivmjwqpəSxh숧V䨷K݂CvhbQ5`IS%+Șp`I_UlܻKjgnovϻ<{Hz^|oO^ƗW8*[9F3; 3;7y2~X,̺qm#Xs'u3f;)H.;ˮԾJ YڻF}>aI/ȸ$],^Jk=j53] GRЖ;y%.mUU@ZD%3IPEeM gn_UmZ8/k(>)wLA&H@3QF!jXZsyi)^wڇTwwae_|Q)"z  p|VEYGRPP#A 96EF\b4EJ $C%Lh\.>̌ K̺ѡ0HFf܏J3,lB?`9i гYZ^?]89~:_9b0hX[L0 k\qB˿{6HGHBy*FdMdu:;+ dҤ=ioK"I[[s?b$_s1iǾP{`4>RDP#XS) *8L%! dhS[/"Ia1;44{I7*31$1Y&Ts1us?N} v>Tq_~<hJ 8 lJYsdZS@˲NIY[  A)!ŦNoLF+Ų󘝖L1f'L=i(T/5Ff܏ud\N.Zg3-y(.Ƹ.=#V 1Bo(${Ʀ,;Zꠘd@>DQiǾh<5<"c $J-+dWG!1/+LjkdFSVBc"L6Ϧ'tO3L>XFCEi HED (p(MmO{ G$ PoR>X|=F$Dqk QyP(u(|qB5 v 4¦zaL%"ʈy J;%P O!z 3SJ`颴T #zl隸O66FB-12&w89-)m훾S:5l2Ša9yR J0bTq_J*6`Ji`L0!OC:a^Ɲ3retO݋0Xt' v; -ߋ A).>=ۋ d^HC%BCJiomјЇXoANu">2lVtMGγo HoeВ* dU I>o{?8"Oh?.'7YMFfa~(s-S'ìeĪ `))bGE4ؾyBKiv;Ӄ)1t bRK$*sRƒFUܶl%g}v\C/MkVXm;ާXW aϴ&Q+'b|ES~$4 Qav8=xs]>8JTl;Ʃ"S!e֤]FPILH?ly6٠2z}bP׍Gh=T FYŔ\6K$eG  TI[Ip>3r|GG$ 8Z "ZBHRn] 9P7S7FMjfw-C%C Yg7 JR ,t ~c+qm@Iز0Ήq&gj{6_!*b E,nqwۢ)!9Ȓ+I}ϡ(-hE[rh>,9YFT0uX]T>NO׻ncX1D?"=umubKlߒw˧{?w<]K||ײy_{}n]mni%EԤ]5o5QRMQT<>UnڬFQ orkKDYyfDF@)a̅d7ҺJagҊ}1O!y W= nWDeCee"gJR-iyAƧK+Q@8J'Bz"^uW6Zy@2sU1% ~hO"O!-dcw1G.s{JB"^ί'yC2 4Ib 'Ylqj ML89DN?9]7Е#Sj6ZK~~64xr'}|)]bnR b҂Cl"@ͦ$J]l.*yRێ7K%n;Tv$U+*2f HsiA8JYbT ,1osStmvIFejSjU9M0ݷI&YpJFjx޲(:i ^WkI2mO{gT0oz(ӆys[YB H^Qr7ex|=qSka3@O'LpGH8\yEY {2xv> +HYn& Av/, yq4_ 24Fs[JL[³ܩ$lDR/'CE/*?o ۇU~[c#!&Dʙ$49uͭS8+a\D:BU)PPZRӕ5Ya꾀ܣt n^< ie]]iMݕE&~ŇJEV3 >͋seNk8P &0⧛o5X쀿&Rq÷E; K2E8 +M&>#gg¿PW*Ro>e{`Ļệ;GtmWZ?VyX^໻68 n(n\Q u*n~N9O'jmt]| ,oŊ|ϡwFCfpDQABئ6y^"Xff쯽R( DBWVЮTtJWwb(rBaTE%hq;M:h~\T(y4 ~׵b){u3Ch3Ḱ4-SS9e,tR6k'JhsY[1vy2!3K 4uf:%<ϓhkb8PM"elQ&Qqc\!`m ETw]7C|xQGL2e"PB*BƆBWiB;oJBtCR0I+̙  Ut(誇t%*uXX ]\IM(thE QʁzIWFjk:M"Xeeo V$/BYjYC)*%*7AewgW?Oǿx~bA#S򪎩.weSo~t7=fBEGa6`. ,h<%t=٫G8+7&3s2$D˩eIn bZGjiisioEʰQ |^XUedR Y?^< 5wbD!J6øpۈ-^+O~:ȥٶڽN+7F6gssy>+C 1RA.%"փ\@(YiUVfDt(y%$t՜z6"JqzCX#:|ܣ?BW>.jv5#k@Wl}je@t ա5w%c]2DWXCWטP ry Qr2Ukb- 77+y(th:]JJ骗tUܥeDDsq.H ի [z0_lūB"{."ڈc6x2HGcOE8s&Y hogl&* )p Ɔh9( eQʐ(+ .S J3UJ5 1RPEWP Z7d4uЕ6VAÑ h2(s/XfP` `%-}nKt(誇te ԎE8v+E(th:]!ʮee5(Sۮ=dU{ivvjߩ?>yfpm0tPyB tCDKM+(.t(@W+erݽT$sGDqa;[*/D;S.F+w4ZVPulV,ɞLB,hED&p%T3|qjjhj(Tf FY@t2|#S9o֘ҁCWjCbHW#]!ZINWRɁzHW!#2 BFCW!'HWٮT%+uT-צo0FPѐl%wjS (a\kS%VhZM%T/$!iXp*Q0Ed anGQ2keH%%.οf5^5yjfpՉk=U)F(-t%Еjߩ\+lO.]5{jeZyBZtCbjC+|DZ ]ZL QJ9U THG ]!\mC+@kt(誏te^28윪 Aw\܂XrhkSx%a 6U+$2 )3pO]Z.e Jm)RZʐl8Xp 2 ]!Z!NWRzHWJjY@t > ʊB;UJK. ,BG^+D;/]!J;HWFjB6BCW}:'?t,  ]!ZyS;TїCWTg# =Uo3cʠl@Wr}ju(]`d0tp ]+D)@W=+~ : BG)]\E-t(誇t 3DWؒp ឺ^3wV]@Xl)=SsX8WQꏻFᡔ.k4.Z1CwF`tT U!4(k&[=Lr(ͬo68ZRJ`l05& F!JΠG7(k+) JuBjxQGRV+L qDWZ@WG+P0IT "pA@˘:]!J1HW}+K!>yfpm0th5__CWԫ!BsiU39y=J1R J tSb$$B+D{Pv-@Wǡ+&!HWV4]ޛza3xH#7t7`UmUd#&Z9֕ꗳ-˲awޕ;/Xso' sګ͂Z5O5'|تJ:\b/ydK&nJ !W[__n/9,MCUJHoڦ #E* ^їvxywVgÙƘm(u<қ[x,;a7~.CMNYUcf!ԩXRUvܿ8N9WS5USgU}*ǘ;[;c8~{?44SJܺ(1&XQ,LoI]eY;ja@,ZA+/[KBMP![%kdDMԴa]^u9E˞h1h龅\v^]l!jwJzNiv6CI]ӔR=FIEPKPb4vcF5CcvW|\L5CёBE{))ϕx>RLG2!(hQJA SE5P&sTaV]#؇KDcU9j{C7*>##G* |dhEIU:{OSi,QMѷ5j',Т+)$qu4~0`ۄ E9nIVCKaQU-1%Bb ![뚯͸\"FUi,֓.-RScHFu ^-X^J9G),XXtdݠtGjOMÂw ݥfѷ#6*1e_h@ZE;68K%- O9@EE+h-a}S683E @RܪC:}C/ ŐgK'ǚPX[]KxP\Zjhc+k.oeB`(*g*hJ֬G7/US ]ߨ^iuT#m|Y+"QJk"dǗ޺ 蓥8COe,l2V8 ]{ #AK~u9A[XQ).wK81g]DSaʠv?a^K q["GKٖЎe>fգKP05. 
ڪ ]y,cw>|L XxNeݶp$qj`1ugllfѓEFٸ:4%{Bg'UCŬ]ðbƵ6$Ѳzbh4vMfc-4a"[/*3bO&V9x7,JxKt}䰮+9P.ONչfD+9,w[Щz`u!%L 3y]>-v}q` +MkMk $\FC_$o*V1 /(q˜a =R/]~hRWC٩v]էNf~5tВyt5P>]jFL롫@cRՓ+q"`CWk֫NWBWOX9((xL9gB4̆Q~ܤ 8E_gWmvRΧ@C (;iIzݝpʏQMn܁t܋~J4Iܥnb3l{S&2N1;7|쓦2_L/Voj49~Fjs)bceKS7Ld*{*!Z?I[;{e?$bAǮ&km(E?T4~*Z ?SI5ƛ Vś T@ś=EoVDWn=\BWmNW+J ]=AQ ++&ƕn-t< (E~+8g&Z ] _( ]=A ~PPg?mGPY߼ib;{ov;xծ˧?@0.'h::Ɵy\6~:C&_W޾Lm"q850_.Nm{~eǛ]Voj1ig>n*ly ݔӁg,CeeRzoW?{עƑe 6Mխwv=g`20iHlk|HZbwvBKSSzn,Aĸ|0ˍ>SpE Q^WS˝oG.Hdt;,.OӯWsXfq󵾶n:ڢ}o&ÅjRCDcZTSZF[%Uj ?RV4~(zqp+@C{-wq(E#JtuߪUic SCؓ R!]Q B*V1tjUFIMKWϐIk@DWh1JhO8UWϒSU߻; ŮH%wn<<EMY3{eƌf&MuDy|8ɏt: hhO8\* F&6'?0813dRunt$t%)P X}ڛUDW-uRЖ!])x$ʀysf2\јV~,4gHWd "` p3C,4gHWQAL*VY.VڻV]qXKs^Lũ}WU'2OCWǡrIs][X +,N~{qpo ]eԝ2JMZztE95{`ACWMV՞2JݪHWKIW͡+N}2QhtJk&k =bf :K~ؽIޅSʁ~Tv7Q2 j}9pkPΌ[j}Qթi{7U:͚Vr:bhX{AY146hmQ)6{> h]e4.')tUFhKWϐR6iN 9 .?TGt(UlIJIFi]!`~#ďc f ZztUJWX+-,2Gt(iHWFDWX+k k ] E Q2-ZЕ U ]8?"S?u\uVs{ՏCIEr8]ötud +̉h ]e͡ѺUFɡgHWTdj 6.)thv)]!JZzt4O:PʀusU{bqh QuTKWϒ TO=F`%RwD91\'/cne0?iuR4_F!s]Z X,^^/zg,Ў{fi\* ;e:o۷o~g^ XZQm$XQ19y1VGLVV#<Яe<$iʖf:F=rFî e^7z1\+s5CU4GJ^q߄)\ nI%9VҜ?y]녺zwdP{gg0 6Yb!JTb.!*$$lȜpα^NpTXl/OK M ًeZx]ІHBb.*N' >ipBi f-'>˼|3#nɲ{3ljfcU,[fͰTW/G#//n`<^ob쬿صFE/'u ۂ瓝aϦ<=["K薛gqQo*2!T~Cn(&y@~|o mt~)5ù=b,Lo,>G?bvwu +(fW,cYF#fS.K/ ߬ƗoLsDXq}AZz/W> RKR'+1eNю-6-c\P= =z:GSGVIV%=@%-{x)P ZRu>f͡ugzthQm)YUZ/f깟B(FGτV()58"s&O'dQNmX*'烑]e2CSi0L+=3~\3q\] y80\3?[(n6r}joh״Xnxmbfd4u^.we%>XPsVӅ<9[hSa !Hۢ~]\U>U$:Av*|۫딈 Sa H($O]Қ@e3\+#b0I(i`GFu:YPhc"QM`b s[ p&Q PQE1rTV N 4mxh6vhP8T+!W%Xv:l<]X7jNm{|ӫ,lL3Q/Fã_8ݚؿYg m:,+.lBα~.}oC]Cs4#U/2B Y`bZv_ފ:ݑ/(Za2ottCfF+af%,7C=+цyRgZ|~Ą;Ԅ7J:pW(˨Ӧ0>QXG w4b*{"vik d̫Z|C=zbц(pМB L>fX{c. $Kq7m:6- Ԏ[X@-$i$u')UE#"c(iI4TJHT(6F$F(gNϝV;D!Rar.( WZVU+6ъDB8 ,#,rw[>>l$K̏ղAg!!19%rec %4 ؀3vVD:{bL|^NÕ *R%ϕ< QwZ;? 8=.yyl2Y٩;!?,h!R>OY)q Hf8] \ i64||;4=tLᴎJ).D8a zKp;Ghxc^;z0cۑHq,GӒq&8堌 FE%`15c|ȤDQ}ԙ¢ p,R!v-; 4N/_:c%9n^> &]5=;0@X&J;3׃wegVIL0h BXWq.hy uІ8 4{}9F@v  $c|N Ry7MxWK-vq5E1ov8p9Gvm(f ߭*ocbX\c\ؒF\#.[?;_l ^H2Pq6N@2[%4e&K'yE9\lh3]kkyYuaosn Fi8П,3%ƣ/gI/v*a} ٕW;Sb{J绒ouޚzSy}8Ѹ',\yW˂^98eΐR?e&w#8jqrie l>Oʫp:du]ގÞH%_ F?csӛ~w߿ʟ_o@~/p +N~~2__#-u**հP*c:?^~y1 >/#ZNzn&r4YdJ rl{竫"]"yU,u*E38$J#ݿL5硤M Gv˅r⥼ϫ~;lUCHSGߓMXև*yB(CV 4}zpT`PD 3vP%ۋeJ'TNy1 dPHH9cqFk@bKA0TuɅA&c0K!ϒTd.!'ЃD\e Ym8rYDg|:G~Z/ ~a7!]fs~qZsƒ?(󒪥 JDED{T:ǂ`mB=Gu&Af8OKRPG\.!CPFmm 34X*f5pkV H]Nc/V@sEbH731K,'^Fhz:D@Txa,Z {ҟ/!1m|~9ECNXjxeYL91M,Zꤑ{8Kr90)U٠x)xs(нҏQayœqz:o7~l9ij8o[R]޷R Kdsw GO?TDMX*uܿ[t6q̖HO2() Q%i7cB`{\Y化KAi}/˖F[e݄:{( 7G2 Xj˓h_RH-1֜;2HvK:&`J9ʼnZź{G<ʳhe?^]y7Xhl~}>jRg, fT翎~L*CDOTo u?~,c42'WoK$Zb8W]Y$ϿmTrwm^ƍ!7s <?GCp#&ǟX͚ݵAO]bX_G/Cs]U˶*yIyIs%w{׵x,txU%֝uOm`EhުUJR=[Ǘ,#t p >~h&98+eBڋ9\Է_}:h?ئd/4ʹG.KW؁QK,tl iscbޟxKʹcDK vikns'] t2'Ҳ*icg=1/FÔwyŲ6bTR_O;nxu2csf9s3_vuxh PRq0Yii~Tq{)^_P [߀y+fq$ERY>Z͓)0Kdi GμI0*FH`~U%67?BE\{[s*?iȐ~ iǗ8m[لju'v`]zۗ^͢#^9`ri*'oq2[y4!5sqi}}h*iikYxMzю0I#k7uJҥg풽1ۦž~/ZM P+ [KHǶils'$O_V{:vF@aiy4 ^t|^SP!CʠN[z~Vy^H =?(l2DĐ8"(ģ2A9ȄTsCBrS9#9RYLt>oii:eH=&kW5zU}H HfЂDJi G,˜X$&1c:d7k2hJ4*!@;t)֘X>HG 0 ۤb B}43B},A8P!G,`b hg'0i -79ʣjOSY*U9D!xB#Q|6!H`lL'pHfhLSu훮zP-havgа{<k;pR[\N˖ܼ⋱UWc 0YUtQ!M^D1'5OQcn$~iO1^XS"G+lV#Ĭ&+Bãf\ QxԚ b>J d]&lQK/CE~5\; péVOyGbG'$ԏ,9T,Al_LG2*ib$=tѾ]U;^q+AUo9QMbɤyU{ݦѦEwsI:ȠQDHG3oryEp)j) .{='TCC:<ޅ)D+oL̕`RA˯ 't֞A;ûcuxv="xj~Cưۃ `ZbLxU3֊imW~U.YCL<6FruN mwQ\ IȞ*o +]߀Q3v" {XY&Ѕh,Ke"sQ `'YNftLM1if8ZH64'ym~;~qC6HpE˘"B֐"Օ.;U8e hM0Tha0c Ե E_ U4E'-w}&j [Kf.1M1 U QYƅimIMrܳd5!4V$KZ5@xVҁ>%wmYG]trI+1eEOs<ȩ;ȃ%L9d,Z%KdLQ &9M.B'|l2nrx6=~),J\֠]MO8]ye#- hmw]~nga<wT$)cV=n|B֮|4"]Uorj -^Ԟ`et?d) 3A*Rm6[Y+I6fQeMqZ NzJ /;W /;w /;[ OAU d%cr r.559(1E0s4Nȹ .iy\eU\edg#?]э/~n_ۗ?Y6>+Ch\ +*;- ""zeĤY匱!S箣Kt:^,hjr>7m o 
ƙTWUU,U/b}]2J;U*r;&Rd N.ĨUm4Nl@ .'J$2=O$Cu3T|?*ZWRōaQ&RcF<2Uo*z:y|  iuPfj\к 5+}I-gytvU=8mr^<_PR& b}*0u)Kz&ť݆y-r 6G&TrAs>XV2\jvKŹtztmQ{ʏ.ƞ6@;Pg01V-R_!#³F))5#=R?:>;ϱX};cRؒiٛ1q)$:y=ΜJF֖Θ/!9@)sBX4S Z单\O{~\Ulh ɫ*i}^+=*Qehʎ(>!3GES Hgiy,E8ZI&Y#0h;o" [Msd_q+Y"bYl7z1䉁u$89X 1i.[Uc/8F@,)[ '閌 CYbт1FꝬgՆszY+ws!YȈK2վRD@WKJ5UP`YdIXBdk xJ{Nz\$ qUVE̴P1jUZjIO:cOGzWRD[(eYNe [8`L˨\qAˬp0U oO4{TʏE&6jmNv5ݾqږ1:R:ޏDs8IZdΖ +fLu.F.0!Qn&Gv3æPҋEȱ'*+ߪ{D,9I;EB 1Y"Mqћ MZT,:02A==kd/QMBYY;5?{ϢƑ c3TCVn$ oM|F?m^(!)EgH#rdmX9=]23;Ejfup E",N&l=DQwatba4-rm|S#Sit-!-:ģ딇tkdwJKz#^1N,-i}=iZii.jyO?W,?]Ijv,zZIPRqs(ΎH\W \E\%h9tqP'qĕ֢zP#@Mߺxs Dj OQa8OaT,)ȵ3KkCXYF ~V+|LKa0Xb-fUoVB(kB.0Z\3ruT͟fw!J< I,,Pp&eMb՟\b qcYBJ|RY%AeQ'I 9Ū>!iUP_^:.EI+}+kr>(=e/JFW ѿjdۚjo}lEHpA?-˕v^*,_8=)X m҅лO/yvq1+ dNg7 n%uL1U \*bwK0؝TZ..3 ku1Z]mzw fA_gm 1&Rju t=Nf3aKmF cRca%Zժ aM6bX*aH@0) 4lAbyrc`bҹVrb#Vo*״S2:jlF|VC}/ 맢VH.̤%jzqKpRj8Rt;j^h&fQf3=7SdlSfI7TzcJ3DWO;O'FOkЫ K3MuK*gMܷVX_o:_7qpA)̂nͲ|"&iUs>+(݋@)ޝJVYsI-5.RԽ kz{owZ,&[w"S^ӵ{/d}7ػX҅k}SuPo3HPBE^bwŒ/t_Hv)/(&ܖ/v_|YHhQ%_y#Pi$%_f vŻm# 6h3R`(Ϋ „^GR UgŌDݭSF7I^1(8HMp@c HwX JN08`="qj̏E\%h%g'qLprD*\;c9tqG#R !L]\%( 9(@U!+v݌t\+_rPgKŅr@XxtZ&eS9T >i#I\y Vy9 ܳy;<&uf+= .@|ƣp~ iB=ϫMq۽f ԃcfǘ&CPMcTSaVYj6d |-8NPɩ/TTJyg*"N^:w@]3}-0zwXCX&y->{.E9@PTQ{< 4\ äVDC0j/z/?c4[NXJ$ȢL v4eWc(/ .=F6+X9^/&hx1A)I\=Bq%J ~4*+ձ.ʃK[uWE\I5G$$bcW Z]\%()xQ+ "]quQ%J&f:Kd  jMƾ~isa>%$ӓnFZ̭cQfʍ8GZl^t<p{йl.&x"W#x<ɾگ8&ݞOHIi* ><7sO1 c kd:1He<!t1jN%w:I#ދ^(G~$kE;x ]ab'{ 3vǩBZtUdq(]"x '. sۘ+|.Z{aNvYb.E/nTx5ܣ2C·\KGD<,hڍjT{ꉕ> MqE&Y*8–)4kMwT #"|:Rb&y B F.4Rʃ ^jA{u^4=1[q h֣MNua(*1F@/aĉ<ÆjꄍoUQc8ƌ Ѷ5Z/r7Lwnyy_p<]TxGDJNcRɜRafXT`;d 8` (O [9 +?.5K!Q_Hsr[Bi+łjD$ ,)0-O 䟢qN[ښ0JOs0CD Q &3sN`X$ !%scXpbnikrF#K$;o`4稥q ֯đ}Sח8&.rcM Oc=H`)" 꽓RL]fz9^z_  N[ B\:$UX.lzd ]¨̟ś%2~|6$*B-<ʼyU.L2pVd\~kbX&ĦK'Qd~]QQɮͨjS};X~)|WYURoq7̯zYS߫ˢ~3l]D}-M.Ni3fpV īz:~#ݯU Ex fV]#X0byV­ KS&|@*'} klL9ҦEX!-5G*hr@{E=u+_ !*蹰y17PjTJ*i4*$:n#4aAG&tzRpY4a#A29 @U{0\"[,AK1Vَٮ'Hwc>.]CbW6JWiö4|3cׄXϮyCHYݺn[`[$QEvIں9Ҏ4Tx<&p9K*G>^];%6Q:ga{X+G-?hk6nӴN>fqK5t~_Ԅ*\Լq$隻$-iluUe HU*l_> PMʝ1&gњ\sd9psþB|{Xb,6Sꂨ/N G#6OF`cEg2(:o;cƌL"^ˈAFs+%3iT:h]l r Z f,kzx^?o{l.âtlQm[v]'O[So.i5AV'&)BNf=tS*hCbu^=,he-WvjBZvyx{_Eϵ!Vnots ]0ʹZB fami@.yWm>牪w̽[ &|%^N l]z϶hQgf'6|ű b }b9Ӓ3KC4Hd']N顙ɾ,ݛ0DZAilpGrHS !{׻"JS"j* ŴX ;f/1B!f ˚B,f@qfvaj~~vr,x~qR p lGRPW(7ARUkLY 6V~qݟ!l)d)pX1E'"^FRLN19eu!b2VաdB{] vK %,3c{Z RXKjNʔRmW31a5K$eyZY&s6o˙^]nRpdz]+E~~;щ܃nxEOWJ~NIסU#<#w>_{8?,,674z‰B)s * $9T3s#xbϕ q*\)Ԣ5JIo J T4@ -ޓe! f5L-55[BT<`kbkQhhX!3&u^YLa/705 i$eJ_.?]]%O?  $l5A/s`p: $,dl@[dѣ0v kR5 UK)\!7.uN}IR5lFߋށ^fmSxݛֶcqYRdi9SÛ!g7CΤ:8xr.ə~Er(Ͼ%,pD \25ֿ([R0aNn&֝`ߣ&5@}?i"l3p+2V+B LU=W\bJV#`j4s;\qpmuB ? k TW}_.^V17Am>$)Į$ (8T-1DBi#t]^f hC:j0Xx  ܰb X">x)Bm@A0B,!XtjKvAAZbKIpc&Î4u;o>l&B|ځcn9ez,;%$& suӳxLJPcHQ05)0V۲eVq6V/.z'-N]U %* qMfl,!' ˎ50{!$']?!㟟LJWo?1XB*.&ۤ42{SlпAhZJ7VʳYߏPVY7 Kfa&q{H8mmiU|Sؤv77qu|ٷ E63>&{!Gҟ"|!Gp߽5)-ӗ 0)my0 NYU8^EUxvãeƼ}l=YZ]"Z_wVW=߽L)jľ-Ԅzp$}2uxp9ģuglb-xBِ}!'wM$r/v |ci1r$*R*2l9.q1Y!ŁUW3h+jv/x/.KXa$ JE$sL&hY.:o:G0!No(㴨_cn˕se>@\x O( ?F%R`\O_r>#>u=ɧ*LyD$4]^E<)S &a~|*Sɻ"{)=H.Ӳ$[sc<׸>X;MV)JvCp4HlP%9 4$}Nq)%_Gݢ{n}ݷߟя> Nҡztr~9]u}5%6!/_9eo玞=nk%~E(p=l|WT2dMg P]LH~+ \i~2xW=|XͲņ%/K;?B"{ߌop30<_kbJ'\D'ӀHW D)YLqjKE5Q XTk>aP5-\Pj-'Qmrou,]9@k]S-9C 16ߨy"'h2Df9+~k0qFw!Av(ǃ A_gJӃ*v{.pF V)B\b'J>:4jl KK_~ϯ._z{|5t]<lx= Y#J%s@Cd%x k,>ٞ$/& belR&ZƖ|C)b##ա53;YO}!M3#p/WcHCΜㆤkcUWVr'N%Jz Pi0 u|Y սc) 5_:DQǟḆ+Ů3o:vkuC8q+8?NJ4Ώr4!8#hqq;*No{*ӕ>5f;!X&3Q|/5is 9L;dvJA lϤ^9bǜFz^rRg 4iZX#ۚ2d%Ez MA<ЏDIms\9Gc"bJ4>u)`AH.p Z[t}9n*]ctM˹}'wm䵟Y00Obw|n\//7Dୗh|!5TJ{AW0EB*TlgXg3kj! 
ϖ;r ظ!<هex4ge9a밺lUqZc4,8 Bubl@>V` jAWnˆ\uJAn\k`JlgΖv;0f[s-b1&Z͙zR'6%CF2-_)NRu6~)&0_9+뿆<R(fYDlsQ]T HBtH9ef5 j'qҍv,r+diﯶ`W0U]/ XZ68G0سG= <?wb͠TC5\c\TR1A21elkgkQپ\ԩ^ S}!MJࢂ֪OB,f%~d߆JF"{lMrvPckvu_1 eVI'V\f?LKn@'v%s sm+9^kz9DRuLVD))Ҧ^5Z$ Ѫ^E?Fs#oW򏓓7ٻ6$Wy~jr7 ] pW+d8P"%RȑDZ2lTU?U]d!gBZ h7 )Sg3.ZT' !$FR=ٓs'w%jJ^쑷^L<XWD),2:zm;=J.v/=tiüD[hʼ橷S?j}Cϴ>:LSuyqǏw3`pPWmZ.Y"M2 qRyV'd1COɴuoє[l:{~˄>TNdȾe2Lgqٚk쮮8p u6&aJ sLzʈNmr1H x =^I\\?\ lk*`v!G v!0;jzR WOȡp}R q@&UXqW($PU}wWYJQDwה }0*+塸,:wݕs._R},S!*uJdpzR ~w^'o͋$_᷷p2mp41ڏwdIY*e'T0H -h}F.'~V^7Z4ר L@=sU qe>j؏=٥r+%H'DB4;fM}lrB ٖgwm^yo4kg-Nͧ7G2E2RJ)tLL2s:[eqH9LJ`r8YZ9,PKDw?z&w+NE Q^Kp$Y~s gWl{(sH~p,.?h.K+>K9;?rlf䇔+qW(.9#Ks*K)h^R:5Hg8w=JV%BrΩv+c1t A1*%~X |iص:^~~rT:/ pJk"- qXnݏco!\CyXr3vQ{^4P_||F^Q'%rw'8*9RZ"BJ.`UڙjAFIT f'xCw4E#$_H40^q```er J,0 KE\wR_ku4/>t[XWs M BQ ٍAy*'8G[AqkNycpm0JRpGY202Fk6;~4yr1&L\`N QF)VQDI)GQM8!: '.յZS#e0/ sULfve64! mp]YYHFz]2S MÜuDD;Q`o>xU[d8i,H|$VjpB>ǎ9D@mm2X$qȵ_RnmWb9/c/DQyrшQpTczTs Wr-;lllls+mE;`ʴ$:Va%@3R=uFњ]%FOҁFA, Xzy}%<%<^ O0saJ0q*µ1y4Ҳg gcU#KGF]?l6Fh͇^NΦm9i/ԿULjR-XTk%%E$$h8i["SL2!FECgKmB4[&H] K5c}EhxYoKqw`pVן}lZnnyLQG]dLE̳pJyV,%VVBaڠ`K{>=FnA%pPc~/xB 1%]BR 2eZ( <2> H<4cI;sCY q4?7: cV>ޗ6=z1)[1[ꅓ-j э洿1`.>es cx:Z;$Eu0ͥDH=$y\bYQ|+ehIT#Q>c"Fr39h)iڝbsַ}gklp n<#9^~ԃ"yTmnyow}Di|^E4JtbbxY\Jy 1GCOA$A:  T @c{r4qB;ӊp~5.Y8s~C?"9Ժ<i88Y?[WWpZS3) ԏ;5_'Ú?>uSk2:%;e8ПPCP8k6eE5ISմ/D]c3h>ӂyǫ\x#ɍ@m@蚪Q~Ð!7#?o6sʢVwU=5ꈳe2)e^WU QI,vVv6>u_r_1 T|Uo&*^ٯ٩_·Ϳ}ۯS{[UZSWӬ\\y;Mjy5<<)yŽ3N ?~9=h8Tۿ'~'?;Am29_zi)441%1+;B mZXYǓ]_(>]^BMpߕ¤TD%@ c oa0OwCW"o+>-䯏w,UH кt[qcO>5En*QxcA5ָiׁ*QԢ\`VoynJ0*_\?iAex: uHe +W JIӸjΡOB:_Z-YИ2>jl>?8e6`L8QzUH3mӨҶ<6)37xk3e?6}Xf")8ÚެVW7'MޫH___ Cټ%h6՛.X&*O peS|ظn45v0Li^I)csc\ww[B&Ewz|ij DWU,ÐLHƀuw8 쥄=|5 KyP?|H‹r2SKt޹2o0+'4Z0+aʎXc Ʀ<҃Ya4ᜱ Zm9䱹!@:kf,p 1PKwQ ygLP`$8M>lqae$,Νjzbz, ˳a^ Fb8ȵޞW08 +V}Y-={X^d!_0o](}Vpo]kb:-FkTAT*6yːj/Z~õb.((EtRQ4 J!"F;dSV$ re4z`) B51Q RL!ZɗO-- LrfoM㋧N^U|h.٨f$H[ A/-bL <vPc zlw⏛V14cZ!~KmYn{}*֕t7EQ{_>=Z8"g$Ag>C6Եwql+%`AjuL`߫9vPVs)'; u}0"{*;CcbC P'"1$?ߍ~7(w%K{[(1uM>)EN;g!Dr "V fOQ߬G7<"Ga%n/V]S%EHX+ YuO H](Dl$DF+#!Euuke~E/MUUGVᇦ-Ce v]HHtaF …#-Ie*KdjQG.lIcBvJˉ,*[&Q0t2)ck[X!EɳZF:,)6dHT&3%G뒋^U^21 m-6~Ymh=ǵX&/'%@JZ=a,/y9䮵U ,/?K7$ww7TUC;U2yaLȪrfGтU̞$K3"&!C~[ܠ4ngV-ZIc2$hE:Ov SS=8U,thY-ܨr] 6 I Gd1% Y3qvԳ^ ْ sDAvQbtf u1ւHgY% -` aP*۷2z;]"B]@)UqB$IpYHTx1D j' ~Jy͕ME-g% bJa0r!^/&$JeA G+fsbOd!r$ x|? &6JAdI8V,9<ea|> AkP ."²]]PWEO|ͦȾʧ+!K TҢ=$ǔ ZG})zd,H(zPvc]:;v4!j4xAЫi]XS/j}nY;;&vƤ(˄!d[ <] H @Ul /F o:rjb(9)D;6Ho.s"{Ox|q`퀴^i!Cd:yʮܝ6T P[R)`HtN썈J83@'"{2='{ҽKa6=Ja68t ;,'DҎ?RA %GJ*A;Im-twBuviU>W6A/ùLMz{ =m1@i_/j̗Wl}wXBHY*̝ZwuEkDs<24Frw*xڈ3丼\$xY2lE*Lϭ]?oӧi+f8{# ضP 0{e UT:I%<8:D+B5B29\n1$>{>]m{2`V.@;0/hkyf$S<^gaҮ<+ä_<Yxf~)G\t]{0PUr0/:c\@ *d(edݓ<$L6Z>\"WJq93^IXn IYCZ %ub:gh)-kV2X4BlB049[T/~^e_QP<~wI.cVFi0El4ߴL$ Ks-M;ъeUaBEZ1KFFbH䈈z)%)y0iPi*- hT("*ȔD(BG~,ѷ`Y30:#JMn;BތN5f 1oNٷc;ތ2{t>7oYojeBrC \P'&aGQσG8ލI4g'l2jzZ[#d ѹ8` tS:%#nqm3*GoJP $&a-./zj%nxJVœ'2>q>oFv-82M~m&(Ħn =nFmݬOX3MfGb]&@Oo<9=p^lU.rSW-7:{Ԑڱ󷳋Tf0:+j7 3׍Ҩ9yc'Q:9?߿Or~W?\Ţ|Qh(c૮ ]N5yX'i,*яW俞LS-qASŖuBw\agi pnGj-ud}:zTϣYtU+\^RMJI2Vy"H@=kH)Nz2ʡgcK#}|rVg>~>At?f?e)ZM:bˬ䓡#^&gcZn?ߝӱ. 
}!b>4@I ; 0 E}WV1܀)~ȯ'SCyC8 ȠQr' %A:]X ;B ;?C:Bމq!f(yMANY bݭ}!\ :a A=01h_7ds B9 LD0`WJHiBR6#uR8&+OI /FRȋ`u}D袜pIO#J9D$"QC:Z4q8;&,x0zDEiM./g@X&'rJ^ jpΦ˷vwHK:fڬq2{OӋUx/gmn,Ihvrhӳ vPkSosؗ׻ۛLitu52A;lݽmU77lc>.ζAy]<{He0F̿zO 93kO%co55u]n[9}Z}Xoz$۾ys1g <<3>cmQs'՝;@y de"e}/Yڻq,9/Ce@{A&eexa+].Rjԭeoo0=n?49   D#FcA+JY_ɚf^SؼQ|"]Z֟2d"F48D¥ v١bY^R2ԇ ɞQI z6l6{xh˛Z["aN=IR*TOXa% % *4)2gK%FQьcR6UfDJAF]HAJjmselUf3c_[[xPnUEw4czz=M~CO7nmbS CLŢ Y}KDGkkAA셬Ĩ@Uɦ!U>¾p zSk]8[x6k.f_v= m3>R4DRg]ZV30$cf *gR[/2DբN7E'FًN@ܩ8Jr0pJ&BE׀sE8ÙS_.N2Ul~iA 8Xĭlj4)eȰx-c,aADޠTRyVAZg\SXdߘ4Ű-RL19d*K=iTb[fo?}$~ՑtԧXgY]T`BdMtP ^6)+`'`<=kX(Ev/=ܓ LX% ُ/h )ZuP D V)Q @_a!PODé[\PK n0-~ 6\⮩VUV*ի4W^Y`Á6F%-H8ߌޮ93wUfNQѿ~~][oG+\XJ\S$CRS=Rġhj($rz/U]U]&Z~G%UeǔSq 9 vZ\yRjS; nqv,G@枀j_ale }{6Mg}?]Wϳ`g8C:jߝ^ѿVk/e"ԓOGE.9瞗x>_kU]ENsstF;X)rf,?E_{j9 rQ~O?cNre)cؐJJ)xʹz=fvNr8brVFP'DqONI <>h:n:˨Mu9&(޹2HKNZ|`FeP%w\IUN)NڟGqqgݳ: N՗_}3Ѵ>5=)JKr! 13)6fC13w67hcrc]/v9U+&+:}nG{3Xt1BL!t1BL!t1BLӡ"t&:!Cda!t1BLI!Cb:N"v1BL!trCb:!Cb:GTɕ(Nz7?=zbц(pԜR <˛HaD~nZ lwnk!v(o!I RGyxȇY)8]4"B0Rᙖ4iNCQD:τB)1$0brYmcLD"F.2P meRPD"QQu[!S%n)8+ ۥ^ĆTv`tk5KtUncjJ#S",^F8Xf.0` ΐN;{xbl|e\Ns\ݭm Ml4YZzQT)ϗ*) h#EFh{9 \~.ܳ8k99Ur7V*%WYPћ G~B;sD %sȷc%9drzF/FżxvEɏՃbTFZsa?i?;,3#)/Y^rhqW-qfhs346T 8U G0bG@O_g:7G]*%[]tՆu_|M*| (YX1rP5v^Y8\?~뗯~|yqOxS^8Ċᨑ`~}!cp^9KTMWV/jV;)~b^>}ZzUnhe0VRڻXQX |PE i.7v_"T"Dӓ =b)GՅo:'>5>K|{|6dDSˉ 5f\Hʖ{i9Y0"iK ٰ.e={/s)DšDẹaQgA)<ɢ~wg:Lr|(LD`w]Ov>~;:x֨U]ӄp߻E)uK!f8%.9'tQI4D*mв:S. =(=kR&;\PEu1qzŒ NCKj’@.vY4p!&S;_$:p77B-ź79Ii뵋lXڕvƪ7=u 4\1c'"sݨ YvYws촣'Ҏ&ln .$UǴZ-paʳ^*R<ĔN8S&*Tԝ- ~+UK5cZ\gڼp\k\X'ft֦ӳ؆fJ=>X$ %@d9F#D  R0c4QE܀>`jf~{^;+)-p_JʻXl:by{B(nc/wo;B~>7oa'͚Fen+۟P,JjVŚGs̚8ya۹/-lTQa[;UT= ~IQ2(\ԚpT ҥ 󣈬='̮ThtVͩKo-yrѦ$B8f;sOYEa_!މ^!tXDV ER% Ti{wn"ȍMG}Uʱvs}&i).t\\=pzO2\ͣYU%K,۾M{W^+[ԼRrzF[<Ϲ6k~+YW~Ou6^^o0^.EE d^_cg^}ƾ~UYֹƕ( zOPXW˺^-cN KK-B+^ '5β'5Uy|Q2khd[ &[uFp71ܙ:h1=KBQpFmD3t \H爷9RrN $\ʝxxl}ݼMPFZެ] ܁OPq2o>w3x-P:t QJu<{ Q](#ؾ Bynǀ#6ȭwq#e?ki:r L9u,Hə*.M,(L@h5C5jH4|Ĵ1i-ΉMF?} <Oj"v{*yCi/g2+y=38MBΆ(ddDALYJq!NhM7mK8  5kd42XqjkTgʙ<$T9cms?ճ><\>b $HަG4eD%W-чJX?VYOiv$e;Kk"8kmE] 9F,XϵWnPscоft 1,8)V4AD ,2Q} UJXQvv>jS  J]`q<[lA+| wj{_[KݮuWb. hwG/ [ nW )d?11IHe N0<``0Op9Hb"<rD#T` / Q춠P>b"%9"ȜJa)Tb3,i˂5r@tP$^<2bu LYv`/n6S lCC3ǿȂG* %#Xp9aM wɦT| FԬ΋qK 4-Wsbz}6=:'d]&"|` [z,q2%gɧ*ݮp#It^&$AWIf6牼G| 5 PC- A e+._:43ρ}o;2IROhlӐuEt'fMs[A߷4jbB*~ThDfbHkSlUm7Uܴܛ^Fi7U.v7-z̓ mumoZĖ⭵d7&#y0ζO ڼO^;\?1~Ma/ Z?=x<̦qP=x7qE}ӫc\M&K\1]jELIdi!9+I<"6%!߱U6Jz[\I||s%.Ά"|(ڶZn`o]`p.g1X~Il'/Wg2"Po$Z`XEQJ T4-OM{RֆFiZ40#SPDڟˉΎ\|ڼyS(.GOΙ^hcK4HFrk`xSb.u^?"c\do>o^7AVb;Uu |]Z1ߨ3WVwRu;NXeܒ|kߎv~Vv.xew9^Qۮ9Z;qam .q(J *4T@Xwjxg4a|: !fZKUkR*z45D2$5J؀ժ BxƦu8 6gWURm6%ʹA]L#d,8,}JЧa.Q!;A#ΏhQf>EB~HE?ņOUh,?H:xòS VM9]OͿ./.>_[e-ܑE,qg$Yw7=(>tyX\EN?mJ{Hu$H4lj%LnC."\ZL-M+ 'w'Nvw'%G6T M4`1zEqv{v߿_YfP`QJT=ݐ^#3L0@Tu+Ùw`L$E?&"lUt! ֟;q1t&<73NGkfz;x~og۳ f/jvt0jonn/n0? &?8A&7s6Xtn@8BF~r}`|7CNtW_$8f\KLX@ǞRsȭj#nsda ̝DJSFSp"粻ɝR<ŀ2o}1ql_eyluо&r_livn4ptqCW}Js1(5R%\H1HYl&lAKs! -~˵Q< Z.Emnٰu#='AkA+-rv=9JHW<[0h6'5]cYR4;{ko::`:`5dõT*X: t4AB"!C :{4]bv #ݞr]Y&s_ PsЂ26M0v۹8!YPZ>xN,/ ?_zMkD#WdnIWWqZ 25VAxh_MrpH@榴o<oBO{uJU;@vL$,˗AsV"r46'`2f4YY 6b!i!Yl:hkA^t/_:Elx,c(Y, pe(9oI)_!j%'kx;. 
ST (+RUu_5E -=w=w-/~=!XBl&IrRYzl@(r>w葼tn36: ; +1lU5[LP؂07zFI4e Q$k7Bk$әcUɬ8kKAKh%)&K6V$V^+[|WƟՆ4,6C`Z&L#_S hU||kdK?i3dvx[1_zҳ⎭|B)/QW C<(RE+g"O6g2L2^d鷅zͭgt 4j,d-tHZ$% NV;BfKBgQZ!FH;eI-yn%@Ddd(^.%4RYYϪ gC=k7fbjiow]DaN;zFuk% ^Yʅ,6ث@MPo`,Dy2 B(Y !5.xJ#'H\D\TVQGO ZՠV8+˖sb9 lyC`RĜ_LK-Jb8:U#LlY#5?7Hs9IPVYr\0K`*<"ew3MmVDhlש)]Qj"7Z@=XtAB HԠHx@5Lg)ДXibYQO=` W/]юjv!_"8րi/SyP]6(%e)Y}J̖|1%fiRbvҰ{2JĬ1M[Q%xOIQ)QӘcJ>, rHwɾ#]߆NEu8~amHmO C I .ëDFpI ;lIAB- hƧRQsrɎ(/]xØ9nnZVsy\mt>^}Js?,˽;1ȑDd\G`hlƄaԉ#!g |2gv,xM"S.%4Z M9oш**񽄡a{E__Fo9=}lr&6|˿? d?6?dJ[WybLCjr<)V9JóXpe Z5h3Q!h''lü`eOELNje6e6NԢ`5|6T,hG_3nSJR} R;-d ݥj;d$ȭZh C_*En^"~|gzQm|THTNi:x;uY%D.~j$r8 ؀׺)y[b_*pv\D?fu>zE}UwVweu^oaoڪN{dumԻi+\mʢ}?k78^*6'LL3/tt4z˔SJ=t?lI^8w\O6-ꉀ ]ًzH q?a]b̕RA57ףa@l Z̓MI4J2"6V&гN 5,8:'src~*2Ɇ (wq6wxE4ʱqt )㝽B`]`CyЉ<ؐ R`$ "C  1EҐwR #ݞr]Y&s/>*T:A`djd  g]rhNuÏWW]\bTUȪ *!\}$Mk#rQQ\ I?%>ö@VBl dK} D ʲ|L`HIH- M!jVXHZȺ-zȑ=;l.O.ĮXKYe/=bó`SD9eHu%dHs(/1gM>)Ek0DPUQ'bc0wmz菐&⟍Tn'S=w1m6msۥ١Y}g[Y؊DfbI3N&lLR@d Q$k7Bk$әcqZiv4t x"HDJKHL'3&RK/L2 BpzX4XƾгH3ZVvHZ$% NV;BfK7H H,H aL2 d^,0x)-Zݬn$JݲR$OUb+>-/q4^킒a0rXTbPQ%fO%i u5yEwVB]ڨj{*VN:#J&F7*'R2Gk/KtrE8@O{NoY*kE{cZWxh#^0b+zBSLaPgG4IaFWƤL#CrfUK۰^!%'`Rʲ^f.QAeE G1/'4+n{Fnf|2\A۷GAF05+,2 Dس"^ PGy Һ8v+ܨm""2/<`y]Qn"PӝoyjqP 4b@)oV=l5Q@T;G?cC>>9&!Y;k ~5ǣt8!e2,nݤu HTeB˜ldR`XdU FÀ1!Rkя'$[[/sD#Dn=::}aWim i,vGZe2=oHkje]RY2i_SI+r#z")vR08t9M!i國*[x>荨w-c o鉅Y]QyDaۢ&= s+\Y>'b}wr|skӚT/J,~ՆԋjE(E}7pW/LyeG<}Ft8՘Lǘ|!$k)DɻȣwL܋&гq2=xc*`y gz󲽍'JfBfE[oOWٚ/봗onr\dPVl_9XvΔt6V 6&ĚX4ٕrCo$/\w\>AHk[z1.Å(桻 #\O(!gTB,ņP#O^=-Ku@k& Uq0J謂8XaJK~׈FH[!2֪1bbL1E k]TUwRl3\X]:_-ǃz/܏yK5 Q++Nhv A4A|ujVЈE5w뜙,Y.Io,V<+Y{{kjFOFg_=F@FpE`gl}q&POx^T햲n&_iSGyw@k+S!3^a+oڏO4=ҴMD\tј*Wʘh QNIif21(Q۴9dH}AnHE}*iA.Qz$SMzZз٪3:P#ڍM;ӓ=sXlA7|Mgߚ"Lo ,;xW+;'{ O8]O kae=foҗã2z7OBI99>Ѣ>"+B2J1`~Ns::ߟ GaX\[ONqDCySSd\>+O|pe-2,"9OXŽ ŧ~83#;E{3;eGClz~&p7a_ \<)E։O|K՞Lwwoo5uO ?|9U`v`<̓{f?ԣÏ3k^ M'?9E~=ɟPU=_{VvsUTމ`ŶOΗ=ٺ jeZuz]j|i RCZVǧ5-fa5cuw<{oEXӨyc|xI?O?_ϟ>?dX,.+=~z_Wd\f&1#l ÀU ??c{H>W#9HGl|CIF(yF3f˺bbgSU;_ >sezaY@Z0T+Cb86,Rfȵf5 =.{:;M&xxnUcvvx㞌h`fzȃj#-[bYh@TvHLqHN= #ԑ=Gesp*?sh~Yd;%|\%TRt!vWW.#::r̷Ǟ21=>%8a~zu ڲ6 /cd+ 00T¬l CV:y%}W ֋H/padll.zH9?1QXtVXM! 5!Y!oEJw@^}icCa0]!\LU ?B-A=ASQpξåHcXP})Y%G J )HH*-́:U"tEB6ٻ|\B>.J4GocعgVMA`. 
cJtU_A%D18`kEC; g$tMNg@+X&rJ<=߳@ OoV_ߠ Vu#ϧgyh3gf.;^›7}|r6kseIE mzpZtk0o ݃={v0{}2^sw̵9kPa!7nzi_n;W;Hϼ̋Yv%w4ޘ=[xd]nx{ˍ{sOvvs_+ۘk6[̓ l~vfpo> p;lb{WtxPb,sG3<;;}K0h,#Z3)2 G.:vN@eRccj;cwqޡ1?帺97ehљ [&hV۬8q62317*=6F_U m%#?[h?:CfA(XC-{Ew>plP, Th>&߅Tgg.G(K^myWTb,ًgP9rv+&SMedžr,etk)5%Ւg V2pL֗.VU8,*Fe6-c?v0e m>}i!/n_6=Y'~NϾs /+I\+ *VH9D}kB1$e؄b2eV&%mt3bx)Є1w^ùbIb&bhGN' [&z+dHutr[}գjw̉,$PlX:ՅGi֔Gn*Rm ;nùΜFvzX]ѨѨ"q-sA[BIkj$~YqkVAmD1<N<9jRΣl@NQfgRIPFϔx((`nge;[nù"~2rbݤvQw0.J]" PVi к\'5F#xraNv)nұ+E4u<"uIpcvlxǙjLx.t* B zP2Xy Lǒ/$5l(L; Ug+%* :jSb(=W^šuv@ ~(*l*.UEM9ƢS˅KUAZ 8HK&GQF.x 1Mܕ PT l/RnùE wTY=oAgeg!pUšB_IZtVKe _SGey eWuVic,x*$ ڳh&@,`A ,zLWN*CJR7¿.13L[ v]+%Z|n]xf m)0}4}X4xNZCH24~JV\lJc[g|})V$]| )iQJr )QP@rߺ%YVTVx\ $@.ja&RLgRBzp68xr|twՓW~8YĞ E1:`(E%11Jy.[Dq7/-5e4ţJH`1TC;cV֘,NtYHYd(>R-fUZJ\L'$&hZJ\iv5h.ZY*c Cj 年a5)yxf4F[C޵q$e p #A8 ocl x!SbL\>l+FIIHJ#c`K$9uwUb\XI2 *r wӱұC'4ui5"ɠn:6иs>FO?l`::娍,L[eUCF!>PB%Dy{ Oy if ;g)eZXX2v* K*Ljd<)>H`O0z0 ^{Ϙ4gGsb F,>OƓ jݞu6!׵f>;/ˬ$Gx# xh]`p:E0'Sn32rʸ%Y 0hk<`ghM,7( FJ$PtlvW)'J.VtZS'6gD%R5lq|Y&H̭cQ/fʍ8GZ#ĚqfD,g6#l96hoNQU!L"Gh=t`.FdFD7?\ J_]Ǡo7x LAdzCa$4ߜ>h["(ŧG$Cփcg/0/4L3 H NhZChI+1#' 5Ҳ13H5CFo#  DAN5FHf3a p>׈cOP@A4^JJ8('%zg11(aH0+1|dsԥ`ɻ7߭(=fD)p[*5>Ïyv2hiX-U tK ϵ.Z2h0HXG<h2;ly6a+B,ؤHBDXZxdJ :ŽOsya%P;S Ũ9X>eg݁~+f/O}~X<7_ʏ{)Ѵ0CPY/3ޗ~ 3(ѻwo i;)q$UHH *"2rO!A:d]D\#;>{2^3csc̵4x7$׆']m)|RQ[_i?bJ;~:!`vZm:]Zi$ħWy.Y6+6:q֟&':3l (P~p}yz8ժȒdga^/OEY跢QZ$Iz$Z T,ɽ,gA0+rmb`FEKe-(kyθuYzJDoЋLԕ+`B.?/ ¬l);2خ-;(ra{]fD o$ÖgzGQ!d/v_ge++d5&Ϸ4dYp:I!Fs%Uܯ]wBfx+Uڴsvש>H k[e =*^wŝ>Y>3Y$5vz~ytM;,e3KMh )f!}i6ؽE%fQ9Fe_- Gȕ|KLd<`D[RzݡpW<\MkRT|`HURJm- v72~7i)&b􁹼A@{NS6+ D$,]gGߔ&jn)nA+XM,QG \8mxd6QsL n6I>_F 簣3$LGhp1\`@$zB+}XA$6@YzJ.YH8j{+Ł\  ?`ALpmVE@\NKWodLż\"^e\֝g h>Y.FMF'Ye_wz(+[BW DWɶ*$HW)p5 }vD9/Y/{bMdYC #,C4o^dj1<К`>;<8'*ǂJA-}z*)kM'-W4Ui:\1-:~>4͐&z9BgA˨ljގ>/&7 CCZ2S6y t4Z4zQ^Ye,:u~?=d1 :MϬSŦ,%Hje^dU6-* JŨ~KzG<y,![nu '1bePwg.E-Ƽ=s8 nh h >&Lw3m&ZѕUE[ bJM9?GXp.[DWXe$ZƤgHW \&gs[CW .#mV<]%Rvt JBt [CWWb3(ʈU/ld˞xj/C)t;kc-6EW Z.@ȡUBIeGWϐeLD\BW ćNW ]=CTXBD8ョ򎮞#]iV0 f&={ 3Ql:{R}1[媴Gv93;C>t!\0X>^+Gϟ^p3^Ge^_Lڃͻi-h,Jj3B}-繢J(+,ܫ; a]w6qpbJYtcgmL"dY"MdB]'?H'_ER+R3Ey dw1ya4 .G-YBK3K(W;xfqL4WͣCʶUB{ J{?GL" h[*}- ԝuJr}+$i ]%5tV?UPZ0X֬MtuUB+ӕ!ճ+@|28tjrFa"qgq_EJ!e,1,^G7C3+FOOvY̯*hmKSwZ{L7}}o6Cjb%Kjv>egaV]n,FfٍQ )n*yx] SQ!EG_J9*!QIZoV&=RgoGI'J?fI V-@z-@Db&x75_w~_fXL=^Աhǿٻq`Eۮ*5]kJ*M;}ԳݖO#cO|/O/?L~hzaJu`VvWQ'OW?RstS{tB]=z}%zE,S," ڱL563w^ǮN@ d \P4,j5ZEjBYI?w*l*/uF*e FR2Zs )8#Z&$/-P_+egNvxDSTyV`^{𺎹қ37{U w%7 Hmn Mf>(=.bEkF,_-.&F@Ed:zߡY6a ݤ2miΑrKg@ЉG"*Gl:TTPJ!cO,V3A RQ(Æ2LO ]*H1 +黆Ǚ\O?7)I3Iˌ `Qx͂Vn&ՆS⥢Ya96S&aϵ€tםXcV"Є8J6 ZORH,Jqw}~M9PI~k20"&L D!e!x%jИ Tk1hFs+%b#ZhZEl_ANK)!Td)FΝ 9II cN-XRF%Q&Yp R!&gkH!-F{쎤}a&l<0.\2RRjZ[r ,BY˴_IлF8i 8ϡC3CoHBQ 5a$m ֪I>tkcM̨3ʀð?A@[O tHtlB)}`X  I/Hv`e)A@%^#6})F-K!/ڐHM8H#\c(0V "$ hb#!@MG. dwumYƾ[>d2@v_fն`EZĞEKJ3j]Ǣ橪[[] 2P4@ZElEE@ +1cUPQAm> h-݈͡Fh;~2Qh?i(QLnB!VˮRWHme cMm̭QJq-Y'eR\ZЬ2c]QcEK05n0ag{ 둔YlX똳0*T*J5 eo L*eX\~okeU  * +5+0Ȍ̆veBC{w  QA\R}2Qg(M DB5%M%$`AOuJ+j+1w\ 7CA a ȸALAAX[@( `5ePDdBEAq4ҍftآDy>25gAx{b΂Gܤ ݄#Bl coSL3H ̚` UR r 6"38J!\+B4ަ2ݙPHq Ls$e丽`ɽ &Y#<;"JPAH_($6d^i\F*^eIKfQR@}Ju0fd0[Qзz $$dA"pPBi5%Y,  ^ B9TECk>84iA 5[^̈KQED$' GcE( /}U8` CMR-k= ?Ղ[U0jW f=fz, L0 mZI|txḰKy@xc#TYiU2Mt%:AhRU! 
0阊'8$;ŗ,J茸pΠ"Ø M2f-B&ȈG!A]Jr AJ./s1&#ԘhG5Ԙ jYf}Jh v% Aڱ", "+P(v5$BjF,F,-;="hyBP΀He#kvqAkl:LMF"h4$?<(Bjw7qVUp*XT*L*BrI|$,1 crĪMPk>؀C`%NڳF& h%A J"-GTjj5t뭊U^ MZ6H*tM>+M-)K `n ڪ68hm^,\_ X*]۴˫\GI=GnvJ''f= ]Z`-Nd4QGmC^k Q$I"eCkBm1&P/z~4l}IeFUZAIE7Brh[SP.O(7"fh8(tR"˕,.TP=`ePˌ`*1#K[Ƞ H;`-iÖ Vc7 #"($'Mm2XGiP'7 B`~?"oQ^1bp0) ,*1"@jŌabY;p`\t%i#*`hAgМۦwZ,~Fw -jITAlR1Cɗ U{&dd, 6#TK.dP?uN^Y%kPa Qђ?joZ 05J!@6(=q Vp*-]QZش cBL M F:`GrStE8 8%mCɵ+F(ЭtF< \T"mM\4*f\"AC, c4 يr)x* i?uS't\- 5uhD;S`]pB&[h#W~/n^.WaIj/CM F: z\2(N~74lq~0 z_}lXl[,Ky r~P,vtbӤ;!0\vX}<ێ.~ћ.swߖ+\j y!_5.oVHqMWZ]^^qVݗ[7GLJж~l?ڸnj{[ 3iDm. ҟ+yn?ઠNx'-n@@@>a tN(q=; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@^'Bߟ޺ەzpRZİ/iﺭ-}ež[|?ry+bg }FsΜO?PF-q@6uw@7;N6#h w;75//U6KB Ǜ+:smr"/ O{Je;~Ұ!lX 5{쳭]h޶a@}m\K-uд{/^{n\w}~Yo#OF] "gvDl4#ԛiFE߷Q4 |lc>6𱁏 |lc>6𱁏 |lc>6𱁏 |lc>6𱁏 |lc>6𱁏 |lc5lFcfG݋1ld;h%#hv@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b';N fFON mq8;ҳ$@ް@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; t:NO֫gY~\<{yE[MmwsOvf,oVO0mɸzJՍq hUq6.q߆ޢ#`Hn ]qtE(b:AB}Z#"FwCWomt;\-V># yղ"VΝLW'HWBGtEC? j)z+Bc:]J"z]@ ?]G?qGX{0wCWa(]3]}K re*"QBW@Pjtut U舮pz+BPtut'u1nJ FΝeLWHWQOaqq&<г8owLk[S2 a-Z/S}x\mZћA؀UG=Ipw>6"w#nV놊u/UnI+LxsU ƺ by ~ބsgzuQ_ܽV޴vƸ6XzT = .w~R =U!{/^{nwvk3ݰѕ^~o,9jj0Z!zkJEz2볧o%x{=:b zjI-B}ajTIs.sιߌv|Š΃I 1q{e%yd2p1e=y>( UJ%iP;]0f6#:RZk^JQwZڞ F+ a+@)ҕZ]"JBW̞S 1=[ُ"BWֆ* #E舮W ݬ]MBc;eꝮ#> ; _0]O; :C9*0]}K:RuDWt4h ]Z'NW++%u=eMEW-NWrnKLW_ѹJK'g*"JBWV4LW'IWQ\sΔz>>OmDN|Me{il6f,u}^0a>8=n;6lCfLq6aTtQٷmdHe`Q7^TJ2dq**%鈮ryz+2̝΃N*ؓ`g]7tEpm0s+BY]"]yI]`] ]D(a:A ZvIwCW7tS O;]J-NvQtC?j/>J1bPj硫ȡgKGtc?0Ha(*@Ws^mgpZa~Ap=Ptut7kӏ".BW6P)NDWl膮$ΝLWHWQfgFQz!hQ"t(o7q}k'D֘3?mvDi ϓakP/gw4 Yqp ps`\GGafP:^J: mO7Xsc<_&ttJwDW)!t7t*vCW@k毮tut坷JvDWB? B^ hs+TA"]/]Wk ]\ ]ZNW22]"]Ѫ[,Z7Mrn'0]}.]s!33:),M8&] F8>嬊ABX{/ R<"ZBW֪tJ8>pPZ/P8wNtJ ]\{+BkPLW'DWQ;c T ύ;7,MZj7yvTϓza}0<Γ:,|w;Ypñ3qkBC]y]XOu،/|Vz^`$zL~}No|@1/)iYr awoz-~+\0]^}oY_p/˛Vo.ouAfe>#?>kH\mC+j~Z/2Bhv ?b5:_WL7պx&WѼt|M Z?;u~[D<IJ>?/2i3/6?J˅U˦J75U_V5{t9)>8-F(o_{lrX BX]ByS2AqUcFWu=>e_:Z>g7T9zTva[luCXw>ְ)z g\.ĠATO0XToG1ouygę"8 + Tؤcswp[8/ӱ;YZsʷSmd9q0cW-C¦ 'cLEeZ&MR{`%QNKyI05P21C7D/(18-(K$FEFIJga>8xjDo!sk#qUTnzOP69.dBx>e wHB&w)򁆇emufWU5\Y8'6b|ݗb>f= tȞv͟7WTHľsэb>=S誋e"'3.>. v8<|p{ Umsܵ v?jxo&.M#LGX^@h!O"ZGi{ĉMO%x|O=lG[3rNl=nb4!c"(:xhza%l-ۇ|/[][6rt3tVYj*'B0"`AAm:nyP>ۀ-uBR*83)y|&}/P:^(hV}umL4uxW10i-y BE)MD2V`TECCe^ < >B}%C t<6C?QxK4X)#N)c*nQɘϢ9,yh\CA37"S!0&|DI|.(M0F[sؘp6n 阉Fq`j+g!:,-y"2Gm41dyt*SGZ&O-A{{&VrQ(!xۻCY~BeaY+a}0MMB#N!nr//5?(.qC*/Ng}]<%C%v1!p /`dŽO8z{w_:m`8豕r.:er?]vzH{o ~PBg a_pH",pGXF8]Ul p;JoiTQE1GǙ #S: >*?OΛkVBӫ_~}xQoU^4|rtӆd )#%GPrԽUȢAA 8/t}PC-a\d v Ӆf ߶NW5Moz%>GO#QO#)N sHȘm19u"H`I85Ci:H{a .Ec=wk'BP}JI7vN:֙:6 GC 4$p^dMz69 LđvG8ve{΃J}<[hyl۱T/rX3ŜP}3ߺ;.KP\;S|B? ur,ǐ?x7n C!RDBYx0JHa$hN9#bk;ݎ!yLAtuP1K-[QNqW"I>uz1"|4X =h[FEгA·jthqʺg[Nhپ?d닏-šCU,b|WD({OZ1Qٙ{gNRx7Z<>hcGPȐab ~M,Ђ1N<+].=3Qum=/pOMf{I) !8JN{J"GNZ0Bq"۳`G7A0-d85>KRp@DS]AᥰBKZB@vcG +kbOYj|30<0d^$IQ%̯1QwsL#P|!(ehۮT#Q!F!H## g֜I4NΔt S$1R@ %' c`LZ@}B]zS!WٵS?ѱ]|`ޢ*oZS˘W-9#'< O.޿[/xGwtH,]2jLRE"I㆑%6Ӎ$RM9wf')d1Mjb.C&C$RԵg7\2.@<"YPnˬ-c2Uۤt$B<@b:d%G|v&oG1otOoЃ߮}M~V2C}|G4__L?7A~oZ']5Ԙ@QcFF[FOHt{3铅cFprM^h轝͆7a>Z! 16†sxY=SXįe= zΦپZJz9RZxro_ \~"dZC?{ƑŸp]m8|HptꗥQc_pH2IITJ633W]F1e_Ҩn5+YOZҥMkih奙յ_|}*ЀH%|"t/?fc F]sR6_O~igN$^9hb0Q7kr-fY&𞄵? 
Z-Al8?ki}v.֬aFٺYz[pQp͊73ftlګIY6׏ FRg7^\ҭjYeX}!}mNOzvv_DsQCWR!T.h?vn+nX)';&7}7Pf9{._'3N݅;&3^@Uw#.7kc1#lTca[4T]Z(pj7])ūu'*c.YkUhsSsLtNaqӲ?3YKTHeQ{!:|TUrFd3߮>fBD;"DaV(OR3I kRւKlB{IU =YDHJ` H˜b012Yj!lLeFCdSalRh#,K!`4wŔYzA2čJq@_cI ² a!N7@hU+'l"#Y*i2!%LM:&F$c,;ܿ*_i7.uIKwhp܆;P`Nqr7kt7|xSPM&Q-l2hEFNYDtrk1\}pL@ވMI6޳@`lT.sHzCC Q1c.WpZjS:' 'drGL&ojK Ϫ!$e7<4<)@$X;KѼ;KÊ*i+<]T:&";00x)3&+ 7h Iܪ2/@$UBeM xWo,jFsn܅\B˓Bl򤛹p5#O7ZDNEL$-K&p-2`Q@T Ǽ`L avuۤu&Ip1IOg I =Ezr|qvvׁŹg,Z)ZdW @oW˫7!yMw؀s!D#z!wB^@Ȑ 'S˵^'bV\'rZXN0\xd"i*wvq9lq"N.5_bQ!A_OP͠B`jdV#>]Z\~:;7gӛc,.͆MMZf7 ef6;~_J09.Xs>F-Vh!L)Hs27%xtQ\ tHݍ#XYnb;rd#JR0(cdV\" ,GcsrLƌ&53ZHZH6< ypecuS,XANﳆ,t%9z's6RĿF CU ÿ y]]_Rg">k`wی[z 6:vx&dTcӥjmq 8joHQY P'Gt}a_R5lRoOY8Hї˓դ;ߌ$u}ޝ].~pe9ăUl9k8{D y`Gp%ƑgU>\4 A &:M.\L=C.vUE $]O(1˧,׾蒮+mhmZVwo[u~n9892Y3оj->{(PmC]`8˸rö`EqVY>tpRi #\bvzPW]4M /ڿ߼5=)y\eZ2o`;#3iEz=?㨕|]QnK2$QV~HT5K,7Ƌ3g1KB1SEVIhwڙ;[Ų[p:wmypa>q\GfoimhvݙWN'y{syu1Δkw?fqQ&7ko.ƍaD8JJzgm s yheJ3ШR`S6,}JN=\[mAnC{ߞATDJSi͜,1J9IĹ.ɛ"U"/JDG|\Ũ<\Pa[wl*>~ۖK4q% l +L딑2fA#8Ndm:ہͧ uOلy2ΰ̒YÄc0NE=Ǵ.Q>jըȒ4@6 o/_uc58:^*6'LL12f'Ё $H[$fV2]@5x&9rv9R ꎮ3gv)xʡ ́ 5AuC k+)`R4V kRAO=ƉO5Qk$9[oΎIaKLd4gc2E$,#أ{IuXwD6hbgo IOcN+r[WhBs_UthG*#Ⱦf| J1c#z4b#FTL(FAd71yP$O{+g1XϓM< $+&ۺV6'<frIX`FkRwcr'R$jpALsYHTUB eI-Yn%D?H15(R`'JrV{Y+WߞzaL.s=Л5(HDc,xվR BB ͫAE4tF "PS>?= OfA(:K!Dʨ4rE "Ǡ,L=zBXU ' 8iO'oD|ɶPmr,29xCΤ9 *̑kUG1MWTkV O,`:oluzy^cM7ow{rSSsd\0KTbo<"ev3 m""DEjJ i_l+|͓!,B.R\S<8ՎD+ACDlOeH4cnJ=6R4's%#-u 1?sygl7˼/~/]Mc;xZ9sD=2`n,亣I #P)#ȴD2q2.Ҵ9|2ٗiBdFg ̀w4|]ŅQŅ ? ֕.9ư #6,hÌ,j;G<ROu~&y臛낪u\5_vIe@"@n/ $a~9xߞsJ %x|uN'K?\хUsF+?TMqGWd6a}#|wrKuw}ϿS(̞#>/icg5zi4ӑȱ%U7ͬ*42;ZLr-JYLK#.eq3\f쉆DWf6fv͎07sJ_+ȵj$Qq%*M\puG?mPްz •ȵ< q%*rqVE?bJaj= DWG`ɻ 8sW" D-ٹJTeq-G;]"nvpevlzsH]Mxir0M;T4O\WOmzMGW+j\{qansYp"2 ?5Ipq\\GJԲ;DZp"N +&RJ0 );DZp2DVu N4):3|wQY} &zRB1y_Āiu+2Q/qB_᳂(Xv{Xz^5M}EEUv9q&2[x:z 0.Ch,]\\Q EWW"r+QÂ#ĕInD]h+t,:B\ލ4 ǹ/rWPwwOSɋ:F\@ gW"Qp%jݕq*5G"ƙn\8{we4/fpewlz{ثxxw5I>T4t(j4< f \Aa\C?`Cm2MܒW/+v \`JW6 *2 W6m JjW>\#bcO9#^U m]{IEkqʺ0~2!LR&i؄js%b٘iRZ ۳oRn|^tM~ۊAh[Z;Jn&j짒DKnv= @RWq%*C\pu @3"Qp%jg} jPwB5]"؎ \6(Qg J^puJ36 X;5 DfJ; +ڱT)v3x$IMK jCi*yf WOmzW"i\GZ+Q9\dE 2I0Y? DnQpعJTZpu W+k(WrnS ^W8 Zʪψϴ;eZy|Ͷ51Tt=sߠv{]0o1Cn2!|r"a\$E6~YTep j\C'5Mq%*c\puPq\@sǕcH +qWPKNSU0+z+d" +QkgLT:Zpu^vmsq%CW"7 BmԳOEW ܎MNeVA]MO&ָjJgv+MI ~. [W"(Wカ#ĕ!K+v }\FujsOjՋʒAqW0JԚ8w\JZpu6-+T3ό;eĩ1l_{e1- ^{x̬Ãqiк^/K:gZnئebfR#K#z V1t]rݐ/c2:)V)%]*yα{RKRuMU{t]1A(=}#W!PׯCW?A|/ooޜ*]?_{?.zWF@N~w_%X]R>F \}Bz6Rl*gBr_}|~r#v83B@5дGnirJbJ\t9iL#J0 D.Qp%j;JWG+\#;or2 j;qq^0~4W2WG6ۘ$8X3 7a3TnYpu< Wl8JaAQg vpwlz|p/˸$8С'} _Q=qJ+Mٰp%+(WPvepՋʰ CʰWPkWF5 Wct#W"(4);D WC3̬>QL<ʶգTtjkM6,xq+ # L$E"ׅQ\=i*yr>F8F@I}ѝ-+Qf?,*o<ƫG\pVeMr1ߝ\J ve)o~/؈WFÛ(G_OG'T˖~859p|##y/ ;t!*b ]Um?Q.W?өZA78cKB%Yk\M`kr}Mgu/:Y[!VmN{d.~UoҍZ}_R']UYS.y>;aHbpl4דjhrʤCUSV0g)՚nq`YUSa[ekg, re?-44SJܺ&X,LoI]eY;j̉P!fZA+ aj"rбV4FDMFS)]ߪUlrx{XmԻsfc=w1GTMw N5蚦Ԕ1O",=նC,=&zΌfxhk#Mj!SR+{yh`Hfz^JiuKue(sՌ{`u,KSE4P&sGS0 ="?"}444z9{ٻ"\2*.ґG ԺF\TwvsyJ*pHE,Jڋ Xnj(Ϡd3ſ x\S>ͭYɛJjjJJAuuNF=Cca<,QMѷ5b',Т+)$q  6=&4H/*mSZ:SjR"j%$DH,!8dk]Y+Ԩ*m{-RSuHFu ^-XHɂ3{Dn+ Q{Yt7B FGYŞF ޹6.5q;0Q1^f G*mb# ܧX⌱E)ؽtB? wԝlܶ; EeyaD)Qn!VϾJUdAY,<[]:8քFF_6,7PYG[W&+3\5eB`dl$pX9Ơߘ k9+>TMAP1z*%n "(dJ& (QOWVنp5TM VjNedb MhVxp$OxFE M餻p Uˉfb 0+lWN+alU{CZ##MABX.H >Z dAPDdEi}\1q>MFޚatXYp﵃wT ` * P#.%L w(P% bAwlÿ/J0 7P:S Cpneqx!^kQ)l~NoPSHHXή|P̬Vt9k(^]LI$ogaD3x_"?V8x4iAn}-l/ $4MBOB[o s@1YeV2)3,]I,Vd6H Hy`1&v|^"Έ A9J$rA1F^Sː> .nK6/d!y $Ȝ՗Me :Ynm[`q;Fo /}M,:ޫjpjK0-"Vjd@6dB^ !Hb"ҷ51xw]q4\tH&+|mXBwpSn{e؋9HbR\Gޅ\CGP#L϶A S~E,Qe9%dmIJw yt V(WfmbIL[n% -{v! 
E2CK`|s-MEE fl8鿫= }n Hz|F3*Ob`^E#C y#<)-_@ڮiVl|r^3]%e%qq6ix==\y0"?fvbf0O\d~ɽh}[ܧ݇-@ ^Jr!rz d{x9\Bk1۷C1wI$Ve$k: buV+ԧ's o bĂ3c /8;(YqcwY|AOIk'6^UkUgTzܧ{u_q K'vyd@J5CI,:>8LMJKY&r_S1mL7kσ_77<>ióIlEn&yAƥS,_{ mz1-[_e2W*Ֆ)쁤pNa0@""E JLz&#zj=Ũ3F9 lTf 6C( ̂yt~) g-zG_fJW!_8nJe~q GK׶=_A@<5O:K8Bt] :=jvܒ5cj!emUCX$k^'<~FJ,&s $(D'=ڨ}' `jrmk)1q vU#=̼$mޙ(ONX`,ri%JGUjh|JF(*j[b&Iʙ%4%kS5HWU17HOm>-f9DŽ4(IܨmfJrDʅ3c oG~ ~*;@"<}ur?kM08/MJ57~_p+  UIRX}cY^aǴLL fuڙG83o⍠PBWpxM9:LQI']3H^Of*^$D8 S49]Xc0LdRGܙ`$1 7ʋs>:N^v7'f+@b$5XfiPhGIcӜfCtg RIɜh$3K%`BGR̟?"ʙHLuk^FɦIE*1n`$WluL))P]E,UZ$[U%G 7phVEO4wS[(4hJx2 [)w{ףنjuG淟s.^>isVǍZ#HyDށ3~"b]]l h0|2h=l"RbYID7ȧ_O@T[~voj{zC:]ՌyirM?_dzEh{)C~%~(ToF1|^5x1j1כ)w,wiwїJ3R7MS[U5\p;>_㢴]ɝ_UWVǠ9b`bq~:Au[&U% Ar@B%?NqQMU_Fiy&oC6r&bd$h48乖W%84+ gfyVV2 2γk<EMK2$PSA!*ksuu-ןɺKVy}H&i抋H#OU>ޯGro̰T/T4Ć2LHDCѨv-~*[㑨 BR~}.`B͉)jo%lj:~O_.k#9Y \h|vbىg'b6oQTJ(,#)"(V(UDUѪL Dh.?_VdH~x%R2.dfv0@|IGXU)&x@O GEA =NMN(P&2:L MTI$JڹA3FR3*9tp6c#`4J-V9߶=Aso{({G@"x,V2 KB*WEǘs( { Ya@yQ*Es^"ݎ`yo_O|]P^Ҳ)Q5GSfKjQ=qDeXQTA`61JL8ZVzdppɀ/,SWVk($촢IJo B&B" ((Bsk3a(fq);QqF.Szu'R:Eu6F>&-|^̋6NLXQ30.0^:Ş Ju3HCrߎ95O B%Tu]H[l-b$O=QN^'RXC 2Ӣa7W E[lQB,BetĕlOwu\jLBa(h֤![HXi:鉯?- P֤@N{ |xJ>.OT%'Uù=uyYXZ{ݘ@a wř #$;br6%vH9X3ܚf9RYV32LXXIjzb(%-Z "Ə^^ GK TP>I^TB4ÂW}cy3V(/5/KzlĩJ@t2FF9\D]Z+F+]J`i8>S>S$"ݯ@+.S!9|^2Ui]g+S!nY"}`_}e,a5 5B1F5"Awhvļ|?jCi+()> j0VEi2 U"$5)K#0,ϳk<e Z8jl,(K$c)I OpWEQNy8V+>Po˵YngmWl4)p3?M:ʢ?g*r0t Ynŭqct ׃D!8Sp(Grf*`2;]lAfnoCb&5@Ne ]8T?Թ,w |NJ\L[ 8ݝ8 Ąk"#'8#* ]Ziri>NmgeΓ;],L&JA@d_6V"VihJPq4>Ȑ!g Lբ!S`Q2zA1IkBYOVZyS< IAXK hNp*y=TTH-_oi$G7hLԂ#VZ_)̒oZYQR1Q MnpMxl>[h4 ݬY<Cl.*3&RI jbugv`GHlx젅Is95$rei,51zv 5'J`&LZŨN4uaJYI\? &B#naށ%(3Vc t쇬²J]jgfvhc ⛲Cqw-wJ@b0$%̪,}TEN_nCC" '糛i^ZczJ(4BYCi/[sIOr$] Y-I;23#3\Fe}(]!廝*Z6 KI.ص<+ r20KIYuIu.,n5naZ౛6ܱ ΢ ^tkORɣ˷Q:^2Χ=ҊFg~LmtJ-fl} ]fJ}~cR~֒0B)H)P}bgguIuW2 G&M nF;nn*9rq^DT]RΤ~߅>ZdVCv8tȇg!t GRvC %"P&7 +*{r^mR@i)fgʓG }gY MŨ)Y|*$BZ1Up(kV1H.fK(Z[Z:em&Be52b5ťQ c?PheX(-2߽I<ҽKM@OWSUWqRduoس7Iy /*'s q`oJL% 3Us3*0CdYKwr R $%ojNYELޔ 뮪B5"}xI@pkT")UqK1B{0b8VZ2}isZD1F8Db µ&2>0DZ ka#Xmj0ZBËWLii~Q|̤ Tl&b{ Jl\ PZJkf.վ뎙tܖKD奈'!X%9} OSAC3kɘ!0ƃoIKL㥗/'}z T f^ SϣRDhsm(1 .︴T1ŕygk!Eٲg_ĺRli:g ^X\翏ΑgIt363iVyp|y( ղn|ƴ'~lڗw^Y3͋;,.fT+QE *-ϻ7T!JYN[ {TmRI &أLScQ2z<*,J,ȳ@R^^|.OӂR k͋HM <i1[˛"vq.oOj!P j+W W'(e}?+PIq#eVHKlfufJygi@:.6nc"E-2FBSLb#ITay#wl*[":)̭1s"HgDVQSIvmG/dVYϙNOO*k V81S_; %oP(-6cQ>o#6Vz>Ҕ&jCj4zipuٳ>j;284FBn ii[ a q?~{LzEJEnN࠘AZA%z's[s}6b;n2K1oC 'wg`N|#Aw0o1< F?G9C>t BмŜ--(nQ'4bdQ:0QN $U1b hNG`٘ 1cTRF`(ªt\1o7C5J;.O9$S#:Pc~:pf 0 |˜T)nTLq$$_Dը,3`溰΋`UK)0S4Ղ ZUKyªt*TyU+ѥqUUޡJ!fQ1ြ,Ua {Ց(a5ÇBK+[ g䮧OҦYJf)miVLZ{?g*:;(4w Rb3pWYQ<IZ^SQE#F 5̙T#/`19`a&X wgQ:bR^7Ky,ub^ ̈́`I C 6pi!cLӨ3r' 8 ,gSeV 8 iIR>IaNx~.Nk㵶h&Э(4) VPG,ĵ1՘32ԁ{~GuG<1_xysѴ}M_TU2VYT'WR-_prZTH~j)5Zj+N(/y$!JYn'x+jVώ=74 BɘpȌS5!) qŽu%>WM:gnriU!e[6jג1J1^OM4 S)hBĂr<`!F! ¬yk鼍. 
1"~D+TF3%iAn=; uq#g2|68,Vhg,}$2 by # P%e\FT 2>g'ErkYrYE*OSZ>uJfҮYkVk֔J䐤AF&6nMX"[ aA Q:gC뜍,8~EGUɌ%A)Ӑ{ǡUx"{t}\R'v)&\?~1|4OE]|8귬4S{g^pEs;ȇAH=̋/'.ߘc ׫q}n~B~ ~K[;ٗݓ|0?y}FlY:3Xa# C#CUdW`%|\|`pcg*F W C;$nYJfӠAN4u  XR%]pzcwƈ0^?rkkcnqڝ-Z?-uzۛ?wGZ&ؙ\PZW{Bpɘ$|Q!([&"d@$ i3k}'x.SXI3ބA{3L@BzebmIDidۓl+/${lF'/d!9*N\9辭تd=pdT}^t#b,z*3&J4PFDpCe(4!L;.@ ZȄFQdxuLTb=2  `db"6&"AVC-py & ["1J()I Y8VV`A-tuvu읬,v {]m7{bѭ.j$.SFR0gF`Qɼ4\*K7>itpJδ\1cZn0tPjTkYsgH[O1TsT][7+ O ~>ٍ11SStQÇkO3w% |R):K:5BZ-،+؂߰LDӵ`UieVRɨ2jшqM'jM~p%&Yj^L+w2ř(_"A b(a2Us` DJn5l#qF;0+Lre3<~](RF]4Sb*'Kk73,eд)IF1½RQbr2H& t(QYMBsWjTζ8|ٻ6$W}; oUe@lW["m'^jJif"Q$HdpykJ"RM gzokuu dP(⟜m4˘L8GQYTZ0*e<oFKJ|Rri?]lT^D ԋT9gJk99 BIUZFpćuqy귒!Se_JX"cI; [{u>=#h ҏleoXxI#jSHJ@A!Yk$,AҎmmt* Dm5:̕=ZPaPFqlq/l /}{0,e'0|!(9 XXƊ6&cZt p9 {?oeVl&9ؠҗcKZK[Jog -꽰Mb.Mfɨ2XĿ4H PCR<^I茪JIf+**B^Dk UD"V6o#^;>;ْV$1w<\.^q3tx2[~\_Ei!5A5T*rL}@w1 9XZ} mI6㼭q1p[<my[qZ,M\1ݛ7ɫAjF O:k++vQ{g{ä7˃7P>^VaScaS 2Gϖ)v hϓ%`WY;SuQb]m7 fJ QuT֦jl/mhzz~({Y=F-3=N+{BۻlGk3jњ˭?&bt}2PVKSN6.Vuw2*n%A 32ۑE4/>vd#lwl7qvxSۍ&|b@}lBܠ}[ԭ:?veJJC9cI{JA} g$#=&24ܑ$w$#](k{1#w+|₯̘;zytD"P zh{sx';--MB2WuzVT2k]|@Ak29PYRJCܞu#3[ 5ݓlFn;rc⶗ofg#mGn;rq%|'vܶFÒC'wNʩZ)>;] V?~n?>\^+ힰU{OQv^Wthc-p$W-9:>R9Z窪TST4&VDG8#1۫7r$#HnGrKr%@Y:A¾K3|ֲ[\;߱9w.ʛC܅R}'A`=v;\9Puo[{@o{ۻV.T.O ʗ>R"gEœm :b2*T)֚jL`6zm2No#>PE2#1Qxn;Rۑڎv;_ǚ2xZ^V-uNo:[3_}V$F8=z^;XyO]oo$|Ǥv/VF{ju=NƳ‡Cg_ <ȭVMi+ztƑ!R :EhF5ݛtFZ;c|0#iHkGZ;vrBv/ZjS+s^J6:lB?j;??HyP}P_TB { PYd봊s.`:u&w8^R fla&qe&BM͎IٕzH!/<[vvz6O'(Q_/&W/s7ۘr=[|:kv_a#LZڒ&!_[}2dTWѐQ+NhX;҂BJ|{̘ $ǔ $cdL 1AÁRdI!Z#ZI(CHUu.X>\0_ӃHC#~S V^zyO/ @_;QB{g]H;eOn'%Lw‚OqvgD OO~{'J?5{,\|z~z[kZb|=Sgō\z!WOi(u (ܑQnYQL_7LY{KkvF%NņxT(&XkJY%YKr<,oB̕6lUQ {ާ&_gWKU 9.8= ֲ )uYZFmCж()1t!R+a z| \guX /PפcBBo$r2Bw˸ c95d"uChb3K6r%nF_tl2',O7W 1 N 62MDMףLDWʀbw)EXAq@S+Gf`FsC{G_MrnM䵂2pڋm&)piLrSdKjmn6O?xX5f$Roäƙ B8{ d5K'Z[ZAnӛQ.J/ֲl0Y։_EE٫k#w.-XǜB;:(v!СZE0 Ub.#ژT9BP֐S6:95HHɶZNe%.-xfy/i`.ʙj0@I1Im(YOKZ2U,1L,LdX$0 A{Q|Q-/ ?{Ʊd /gu|"fs\Ȓ,Q=E=PCi#NcvBjtթjagI+(-F͐cGYg|<ޖՏ~y7QH]WT-vN#-j0SC[# m˚\TBGpxTIdL/FlxFzSiz_({P[uՇp̶u&|Tb]\F۱O`h=Q.!!J &ЇHgRZ=j o@(_]#N XmO@M^'n;;›]Rj$OJ5DS h句%@mr FؗP4T``ҵMCj@uu$of04zLD(QeLi~˧WmyC &Sf?._D.ԥJg՝ciPHYaa5´uk"HMA;} PXG4iY(_zYA,@uW1XkF42(5WJbr jD-"*Iz?8S\/G zkOSX9'Sp>W+:QR<1C/T kce0.42ࢱ!J"_c]#]OIjuZM\'|I 3֔ۋfwAJݔ6\gd_&/l 89(PUIljʵie3ҸQ=Nj<2k9R{doe΋:P+)۬ %BAVa!&L*VL1`TZk(9TVT*t-vz;s'H5\ђ d\Dyoah,Ps 4޼! 
!%([ QTW \S2{| ǕLdcjhP3dQLi̛~7Oed8jA@+U b:9cT6AF˔\bjJQIrV7+N>ZʺS> ݴ-%ɬ{`K$6=UɠUGέZɑRK'Hi18ꖚ<|POrN Nb=#58z>j޾ GMs;!OZD9/a~S."Rq(x%ϒ#p)K eT\HWoC5`˃(^Z)~?,!C/E!eH oJQ#er cqyܾ: zKU|Юnkz:wtCwp֚q`9d K'4؃d7w{jԆoaJPd@h,i,[B2H$0AI!Tcay޾H`if yəK)H)ERG%"6,斫|8Adw}䒷49Juorzʇ%T+ :VZ4~Tm:6**DdJܘ~8.dp0:'؇xκ(hTީ*JAZa-o<+46Bl~oL/Dv۾,ԣj 3.w1jTd*%\LEm%h\[B6&)Er6(&kG(xa򾕯 mDQt=^AM^!^@pqCs,8=G02kk-p6{(w]j .ՒڣX[g.ID5;^f;*S$|-fX[2!qX6h7TB(|Һd2'bE -+ l~0j=A*X3qԜN8pW {4>Zb1۝>u]k{GΦ`S9tE 3{( (,5S+֊IzC}Pyx93i%\&۳$Z[澃= X p,(x@lTPz2[V (v?,dLh3Ƕ4߿mh7&jv=MMc`p Q>)neGʍzf߭K4z"NG-#W,:D( gyPŵ^r<|V8[s UWQL )~p}rƪzHF9> z LDW6)+MTC k(s\k4J@In fI.%r4 0*j֤r+_8D@!4C;;X612)7WKe9+{IǧP޹_ԗcl`R(z43{}0 &K qb![lPK[B,TU{aۓSJp TiL:.,748 ,pXZú ,n ,-Xx KΎaN K9ܥϝd>(XG@oS16gQd`Hm>ID kf$F-]%`{sN^K8!eFcN=XC\6{g(+Տ/f`uW b`dLMY0F2/Y=_>,mzXif6˹,w+SMm\f&r.X˝M8F263I ]GF.Cj.W=ƒ1biCbM99?[8mL=˭!#[OoB3B2\6xs36X;i|fv`'uvqemjewƛt17љ\RJKM%RrUhD7ʓRs\BțXFs- mr3W"D38A{n0mP+nٍX\H5|Us_U|8 2p/vEK'⺉AyfXIAfȸ-NrFWgz @5a]YwyqE]ZEx΋įyawb,k5N-N/z}q닓/rK_Ļ9q~EY:_,~Y/.kia&yy JS|^)׍x,s)^)`8챩<㻯E>LǷnt{|FD_b|v~&{|;ؿիS̼cj@{ͳL\˥ ꞛl׊,gWo½sn|bߝ^W'_e|MrW?Գ-,Ҽo5 o5TQ'>\ߖ֞ ZwBc4NZ7RIV_ն襐 ]RRf&:ŗ$,Ϸ-'./1MmuVd1E'o7E # ,mĿ7\_׭Z^;9a|aj- oشJbsbmتU,AO˷F(ػq$W~ٙ;GlWDllwlo@%G_P4 emKD 5VOY$)'[ӁM[9^[+x٩,!@ [-f^yuA$VRQPzY3$rp_ۏ-5d #7L(JeZY7cY9 ] @FC/R:|%VA qc E}\Ug yukUyכQ3[=QU" jv:CPR(=L`V4!{Ih顏pNoTo9vP6\lj2 ֐Rҁ EA,|&z3K]ݢ!LE: iYI -ʜAN1892ƯMe_(9^E<|עhʹ8\dzן7 j{$N7.?ThG Ί4fH%JrJ{0:"xZ ؁#096'4u=*=I: SJ1 >%B}*#tݧjn)ߕ*/p(ce͞|,cje>"H)U8^}_8̽xn>/l&/]ԥ*C7%R&ًG3.yQ. *ˋǹh™̺ pYQń13iAEa/&wLY@|AVƴd ?">:ж:aݐ+@j'<{o0CGNR-!\LSkI#. FJg9G( PgTHyp 1ݟ,zd:R$,rm(FEXeT2+lm! l]z˸m.1 Eĕrsi!EIMb>֛9; W( 6}l-cWB֑J c"rȠٱ?fvDisHP;mJ]9Fr ZOY 7\QdUaiH܇.`fF`x; 2R>̃zy$EP/ޟ޴Хl "²$@ujN0g"#ڗڗ_wٳ])AOPJqˏR+]=l]zs0SL o h ]=ן(12AynY pT#v^ՠHJU메XsDqk~wӖ/X>^OO1R~UGB{H(G.3 ku! S@xdLsg VHBݣ2Dj̅R"Ƨ"իþ{&L[_Ǖ]5Yd5;xzRa&ՏoJe}$^KnTLnFwC&ي+d̍UKu~8[R@`T,]q %ǻЇg/nWhgHpIX7Qbh&RWJ ЍMhUǬ%]aV nj*͉!!8RW8sVki-QkH[ŠL*JT`7Z] A|! o?Ƞޭ9}yzB=]y7 ey.ŭpm  acGYOxУkM} Q0A3ew5NOi>H`TO5NCe!d D @X ShqkaHDk5p'$\hExnYN><!{АT=<ٞs'XlDE+`r= Ȉ-$zzN8xSBJ㚤 q!UW&H,H+XKFD('tѴ닃iy&*v'P@RiMq2!rrz7A< $ǔQܕR߇Z/%$BN?33$SL`A07`V NPK,C?61&qYpZ! ?BR "v(5S \Ճ;te&^'7֑ʥJq qY7aߊgY6YcdMY6gMđsGZ9~M6֯WW1(3*[ W{? `Fw5/MXcq.ϩ@C8wpe# Wo܎\(ay6\~P"Iyzeiy?_Tha%K-Ӱ^Q rZ{Ņp_4<6/6'R4ٛ;rZ #Fta,TTb0'l^FU'3TE6׳g8:bٰl`;* 3z `+C juu% !hCR؀@le佗ՑXr;I~|5WU5\b[kHˉ:Fr{sNyw[V YcũA2X\Z$y ϗS,!ԛa.2ߕ0d=;a[` nsq[ʻƈBIVilbh̀ .'jA*$Z8\1 Nqeqo7C$yg:u_8ГSfǹY_B4{{AV25 ܸlAH#Q3MiUQ/)gr)5Mޢ?כfbX7y[%yYAh7i "+VGbvM$(>DŽ01>C:|:j1$s:i-$PiL1\Eq bNpX"uؠl,J Nϥ؃9A2'f h;z/!xpbN\QH9 s+,vṕP0D)S&(Nnoj(EuJ`e$IfBJj#/ >ʴC@%a|)<Ɍ#4Ĩ #2@`2IQ->D崲@72hw&t ;O!ղo3U;mdή÷g AHo7WSD:Aȯ0s+WSrX_1ZʑsTR֑yA w$ΨY)wz[%&Ӆ8rk4^ 98>lOX I)?\PLR<4*?+bs|*skN o-dVj諙g#n>_]BGgel2ȺSު^h˙ON1C2OXael:Cytn VJlrJdpF A *ǠxJ:W0|ٹĿ_>Nb| ~wLGM}77٨<1ミ-F/Ř u<ƃߖCl.Z_xz~B+Y 49gOogpvy Tj*5U5J f= 7˳؝ Dc9J0˷v9z Y-ٴ_#/Xe~JcԹ'w[7TezJ/7)Mk["xw _'[ndeE12Ss'6C$\:c:=!Cl4zw QQp`a )&pd{@ oi3i_k9=QnfL 1[ͭ@rxnPGzEb6'mV)nz\TPfx \+fSQ8PT a 8 ׾g)%M@+ Xf\4r0_+zܐaST,enp-<Yڒ5*8x RUʶ2̓;3GukՃ'.p~NĴ|ŢK`ѧ4WgUGgk} DQ4 6^mw/Zߧ{pCa]#H3Bi` 'Zi(Z i,YMc#Fl:݂>C9bZ{ZGL*vs\:~a[1Dt+K jZW8 _T]ڞ >ĈimJV>bZm^a*On܉Tԓ7NUkJ,*.Wk6`TZ$T: >d_9^P² g<t }vdyfS'/[ׯ~N`FhYV>Q*Eu8C: 1Ғ'#dH(O8bvK?1J o;@Х5#z+P\Gu'`u'ӗ~w{I0GNrT3BW^? ;QO0mQ"Z;.=Ci.+>_R[y)GG>tq%Bx++t2ů d,۶IY*6*#7,^rч?k4sf,+` Qys}{T)MFRd_}țֹ"Cg yZhϰFlQ\e,TsZԥuO4e=\TclVݛ3B  EJh.n/jdخS_r^i% XkY_vb8GP*N X.d> oU:v\Fc? 
Rށ0Din(XXpp(0DXBNV0ƒW}TЪ{okڔ]qThTn#ƪW[f-؂&j>4ӮZ"Ov}#$G FH:<'myrB z?7__(7qCA7 fˣyYCGg\`\DpתV[nvt+F}j,bGW|ߎ: @|_<%i4c2`\eziSۥH3jNƨr΄3ǏiV܊SkkeBB3ִ^W :wCZ&:tH+n] V|oVlxm խ+g{qپt8h0CvQ]޲4U#Hs~9Hռjwq*Z8ⵐV%V'L!d ub" )D:dcmf%mE!pۊXMULiSϗ#$ 'ߎs0Bb@""_z: |ԔHrd2j7 %_)lYr/9,L dsϚ%Q'x`Rx*>(R3kRYr ZNK>`I=Kiyuk p o H)>lp*ł[ãk{ [?'~eRԘ5M龝|:]h|!D=FF U[J}oQDS"#ͤhv>7콜s]cb!ǮG^e!S4 eVm{Apa04"(ԄO"bNHRܗ>b*2>!!"+nCARK~VQJ^CvQqKrr|"$DV76)uxսʹ=/FNb^dH-o46J'gIx|v4K% C9<5Nn:,\j5{GygG&k&`aŠ&xme}:I:0tx xOvSS L4ft9dqjly Btگl [|tx ݿNA>Mpd|tN8H.MyY@OvjtIO0In>N%kGP^.&Sh*Em~v0Dp??y#N O2̃75^E&UGPְ*^L}մ8fq T|ݸ~|pwOƵ?y?Lb0'qz\4 ݇k4'=? -xU>wB GPz$w#W+ g4!RXxwp5𭍺?μw6w~v>Ξi=5|?~ksQ5/Q3w:{Yn|}tgOxl/_'ׯ8]<{}\Mn7c~vՓO^(\^g2 j1Ak7Oqă r.9ђo7IkeHu pTzOtݟVn\?Y ? ??Vwzhi0^0Gx0zIMO/!$8wi'3ܭ_+\84g0";sVpUv -\d.-s]w-X_簁yPޱ Bs' L|;$nl痞85~n|Uj1g ΕйDHP5- C 8 3HkD:B T/i# E#\Ο5\WD *V4V&0Ņyr=ӬNڳJ;W~)fV*( l;2"g pv`qg}Ꞁ+.z|'O9c2{*M 栀O%Tk@ડ%FoZU%$Q_,-p!=d,(蜅Dz(!G"qL*?VKk S '2 )MN=?ytȚnWh룗ώG3[sl)R^і::>r2ޞ3] O]+ђ5qm+{hـh !}sq+8Q̈CaD@ZEf @5W~x`PFAK[λVk  ޑAf=SgȯthMl>ޝT@GqR( x- p7Z4H<"iGFA`E" $]H[B_>W Y$vrO w=Zdc2 =crYB}ߐcV+fւ#X=aȧ&T*2%fTHCA15pJE^ǀR{vqU!HZk6xn㹩+N=e] XI YY:b*"[U{7.x!kS Xݵ? Gvt6_ ӷcFۂD9w&K j$b{I:_L好 ;R0#zIUw(c8 Y棂?d9Tuwb@/axޚz'*PXBo99Ku}֛c ;N^N@+D7Hק#^ΐ]Yv|U]f>ror\dEtWW?;|_rNjd9O lLl^݉;x²yX*UChl=G*Wn#ϲ k~<8҇ !z%YcN xN=2Lu4~0eC3/@δ*yƻ!̼QB.}; :i!Y[$6V=~vU!:%{昂䧼6slD`C'ߟ; w~Tn]LQYxXLGaLRro8toXv#Q:.:sЉqHN2"[qEa:Q\Q<mݟ,G"̴bF.ڙ1D()6+!h)V~", Y҉,:&G崱)9c!ݙN4N?+;g@Z/vzp0j4oFjfS6řO: RrIyb)ӖLJD:8:d &&"$:MU'8`o0<`+jgMb9åbY#E ؔf7CJ[jJ!hJQ٬!RDTvh Ev: :Ma`A!vqԇ@Eo`XbCXcC@1[Xz3N"ND;8"7{0pda *:,O-mn=DYc)x奠{wp?ome1C:/~!/VK-Xr`R5B-xUj=-|H?65cSӏM;إrhn@)sxۍ&O_:ǒل1X_im `D-9lI$XTў0ӭtN`cdEA+SQQFmҘy0)zZ\)7"mq n:F';,~vp(lK22=(b**&cHP],(ЊA 'Д )v!g C:* k'z$;2(d]3$X +c%z3d\7LViRH|H| f S {Zž4ž. - Z!U֦&Zhmډ. MMVl8+TpdYAlȦd@B8kC1(`?zt,Y8cO>qӴOr[[5ުVMjV]6 !ґ ,e)NTB%2> a}1 wXg9dyz1 ,UV;z-.<_f[~U#OK>-I_zx?V8|l7.Xٟ/k˛Feۙgr~{7_1O+r~b{'?CIvͰg 7d.Bx`#8\0Ǫ/m*fc83!IvoϖX:A`;HRA;KBO`KDɮBм -h'9gN4͈୅ ktr>\)4ZЄ@qe}b:eU IآUHR 5Ԡ ̪Lѡ*B-=DEExdUtW [ޮI>f_MfR;`3E"FE%gTʄ2:BVT({-={ڹAș pCnf)bl20B#BltADûaJ@Nف/.lQmd IecB)J Xy XG{J=ŕi=yHkJB-}(mm3t6`4 3tVS'@Jfl.pu<.L;`)#Zk:X*[Z1'Xj0l+2ˏ Rpli}IuQ%¾ PYuzD8`c5$\B򑵱l%~&M1lRQ:Jld knPo" =8%q, bymu+6L zҥksQmVtE=UEǣҘzPbJ78Hoמ9( ]wSQvub+!CF)g'kȸye&0d0o[6qVi}^g{NYO*)\m7pw2xyS+r2)îNn5eh-t6e2#m(մ𫓇j @FVݔ$a08TAyҢˑ_8[YV)7PЁխS7f.E1nDKH :|k!펡D {mm4^H^)`ǨH~&gl/UE/7AܟX3+OU̒͛|֐]֬3O:}҈'{'/ڝ!|1@#xf˳}P'gz5`Յ}֛]ʎdypWܔ+=xֹw^BvfDKPKVCۆ -g Ҝ̴lhFH>ϖ] 2%)ʾ@ y&ݪ~/` BfyТO(0և\ !Jk'e۠f7z4\ot̀1Y҄iFy={ >c潴MS)yt) 4/JO->XaH5*@%iU~Os;vOwy|] @k}hj߽Zl2#dZp6!?fa'IŢ|lxpҕ)i%'7Wga o'/}z'/}BWlz7έ6<Lق{E*^%(٣[QXb\7_8яeׁfq8KC:c4 c^>s4>_yoȇh(b"-Ø-1m#v[J zUDv2Mh$#RB +5LfX-rlL .:{Gu.ؙۘ=^r ȸsJ"eR(H r`r}H;6v[?~=jkۗv: ޹DX'PL~;A~QNPl)NxM^B +lU jD)<{V¡ xѨ%;'< QötkxIJʪ]quI,Ѩ-c,hq״m2Foϊ5֎ִW6XeV^+'&FT6^KY(᭽S×>ޫ෼:snVN"p:,66|-jIK!i#rcՊ[lH݊ۈ߰l:PNFl-rKtlu8mU'J <)fDVt( gkUaVݢSAqNjݩб϶^6YgZ='F4βMmn- Aв NeKowR=ax Ċt~CDf[3jkKF*!+$b[[wSK[;wΣXN 2SiDZ÷׿LR*Zgosv:~ZB^]&V5$VMp/8 `B8-u+iwFͲKBf474Vx=ܙv#{JTh5l%-whb\\SkNCˌP6 VM$PI_UڱEעIˍ֒ ʼna>mHnz ґNFlp7bӶ#[Щ$ʴ.f?ot{^ʨ=d@u"9NR35"{F=f_Os,3ީ$CHi! 
]'1J ejڴhA)dG7o˖_=IM3wc//YIOϸ7o?ۛiv)Шf/Eɼ9Mn$U%kWU7?x5OιS97q]kݏEr_$Y|~`un >>WWk\陗R6L5Y?TĴ{;Ԑ%djvw2lz?Dt:a/N"CϔOl}:k3X{#tO]J◱3>RI"AE"8$C%ކ\;E#fRe^Y7RgAg"HT㝏x:cZrk;,:8O"4u o'AYQ= 3߀~y6rz5{BP83ƛا~fW6b>2eU,>S 4 "6=fp-CpC\7|qwL?Ɣ1[Оf6/3֥#~t}鲴f])!: bߗkg;ҩB|p}u6r(_`7z ؿ$`B̉&fQ7Z65W!sySmZ#^qйG-eemߣp =# [J;}66:/NLxkԱDY(Xب' % I)+ݓ;O:yS׎B4BQ]ug@t6rrY>niDA`Ќ!{z:R@ hLS\g*(` K&dFhLbȗrmߤr'FDpц,* PY@ (.t *LtC謀.6X[IN3F!yM+fv;BNk( ~L30CtN2N`4AaRͤ\I5dd94G ;2FkӝCeVCAf-;jܾXH]slyn_?[{4'Qno!l;m*ZcǓq'˜!r:[1=)qjaKp-D*QbB"Z -)#i]k{BJVKfv%XiH(JV"kF5 uϤ1B.%#QB*"@!˞Ǚs|cd{=6D˲#;D\Ʈ{{ZA jY_9rY|QYTiqu}tp<;{8 u "y梗1r|NhI/ -[,HBM]%Qm!l8&[RH9خtU -P@-Xy\W#,nV":7ؾor alAeB;l5v`lMgi"س-:K K$l =:TSu7>[{S8t@f/vمR#qn\ v vV^]=?DEVK4-kؼZ(io9{N!`1נ6d:`I_N; я#GcE>>e D3/+Q|o] $e4y󜲬HCOSiObh\/fHKFİ o rR5$ĞI z>)1,!] V Qܩa\;l<@gԷ@mKE GCAeTP=z:a.]ϡ&UG*%&.˨` ͂oj^ϸ`IqۛF^Oƾa+TsoQ$T4ѣ4N.>9ϖǤB{fi}iLZ\zlzzDϺ?콥n{<yx3C :pVϣ%8!8bHî7>ᷛ|?߱6?Rl.Mylo7mvTgWْ oBJ:;+j$T66ݤݝ@o%Kyߧߨ*!idz_ f>{hq($-ڗCsaf{CQBBr G*l*^db,~+' :p&S]vd/.nFTyΐuks`Pg㝌Qfʍ m~ASf-ustxʴ6tk֞M=}:7n١ogeZ﨨|J|:U_󌪪ذ&P'mHW<îc,vm홗yD[dE%U*lАZ<*##"ΈO" Kյ AGuZT#a8MS t1KJ^م>L,*'5.5\;UV$vD5N\^]W5^ A%Ya[Ri*uR[׭OSKԡzZ:`r!NNaoN2!p͟98??_ ;4ݮeuNμ&ZLBhr4ن58m/Sb*Ken<:$d.DS%$ 9i̵L凟  ALCPO.Kp,+dѩBqI&mZ r[{/8z2ԑw6ה5 e+Rhr0ӿՠ WQm:yLlD!% I!MUYȵ7Kg,vTo;[BG*_+l~2pPt6O)7y|v=X0UrIWfrj>҄_gčp fpA_l\F~Kä0݊P<ЯEtbu1xLE[۷gwoY7?|8x.ҷaM f(A?|kXlQdbΝ 1BN'z1ꨒҹz&kʃxҍϣ[g KBh$7נ%NJk7aJvpG+҃"V `QX DL'O+Eu d Fk!o8#U S~̨Rq} fJ=`ՠAz!ONjq5߈/МB*L82M]yf52Ozu}3*-jyI28ºpm%lQ1!{{Hߦ*=)6 ɱ䓦8dr7@εUa% xר<MW؂vY^-T#>Ԩ^TQTوuF:k;/Ʒ}%l{Loo,2+0.EΓ4]b;S]͇i%Pe^9_ҢWqhx0TGD1H #Sad xA2`Ak%6(ola$rD7$2Fw\`+GӅGLF3!n??OΈjĖ@5o&袎ھӃ{c>u4,XbBjRE0!E% Y"@1堏JF=:+%J]_ٕ N NYaP^ ^%CW2ҥC`P 'gN5518 o/ ?~EWFy<܇jR)KjZJ8nc .5c0#Hvb7Dz8 N;8s^!L#Sd!MP6^f$,1$]!Vu77gPO?&þ#vMp6BR K {MC@xՃ=QR 0K1,r *gD38ʡ稌2H%L^<$N$Q{> z* P)rP4p*7hxRXo8P!Zl5s /F.P$e @{xC&ίYvlT_Ҟ!OTXYBY&Pe(\y TA^ˆ\"\|Cyn9&QzXxhTJ-mnWBi#Pɮnrn@0T߲<Vx֏tݛkɓ@dL&:ؐ*R2Tքjgջ}㚇*(ߠM=uA!ª tWB*;P@J%~k~sRS\!Gr!GrT,\%*9 T(֥: >a18Q6:hWι6]Qε mG~L9vI.i%RQ]R?X6M(Z}Xѷh_ V b7  q4h*GӅGŗCUxjH8)8|Y TJK/9QٌRfr6rΦAi69A t(WZ2@-UttMS\.Ԉ.7+cQXK~qpU\B`E;ZVd.EJFr4f)PtF+r%stP_<ؖ 0tzK:#6F|J :nw%qe0f}/$@_0`%ߪ(D$*qC5! ỌMzŞi*˽GFq@\"T;~~ R?=3LL_ɼOIѨ(pBg<ó֙#$iY%"hK,laL9"*RKA0 Co![B1z hi{ {cjE<;;%⛃T؋7g_d1Cb=Hr׻.4Բg^J2;>3UVZ%ӖgX!.PXuSbCBE8PÏWkjQ~ Y 羜c|~UZ2cDםXzѵ Pɏo@/Pzz%v$:`sbgN/SGVmEQﮭ7q2/q3JLrzԜSc63ph R8<ǚ9f My1_c' 6ty6u4Zwr;dAp H&,-%m1~6ʶrO KQvr1雛߯k D k*Ik$DU'"̩L\ey9aeԖن5VIK&hVe4NſOb`M%U_  =c 2hsu41׆TueN:9妉zidR) * 4QIQ|D(=g?qkCکCWg>M]q0G4*McǻۮY?ߕUS hyc`(IÿȂ\R5RRnO#haCw|숣Ȧ.8UWvA%Hx$*Y-wIaU n< Uϻ)4`qÇ-0\A?ٛB^Js}wDQQuV6u91ξP$vHi$5I٪GO{BePTоM^oS^oӆطz2t-%xWg O Tb:vҺtTWB稞LzZ\L~ѲgzJ1pwu 67OgV¯YamgV& Y)80Z´N;& 9 "*&`wZ͇m<6B߆_]ޮ?;0 z3*$NI툐H!tZ_Y 5&iւ. OoGOn4OyNdnGOn4O45m2G'W8 Gp%st;zC4NQCX"ștF`#BZGD0T}IsYON //&FǰK4ӘZe҄Ш4u3.8 N+3Y Ps^E;3S)Ϩe҆dr!ƻhۇb+{xOF:i8?kNPjC2C,*SrU܄qnx\.ϫ5lqaC{|ц/nP-ف%Yǻ(ݠrG^0cj4bG1m ֆ)O~{& n{=}E={8tr64?eZ2qo {IĎyj+GN)&4.?n ܎Ә>9lnxF23sdd=u//f1#i"#X:;~XOz7PT~nTQQq9v+'0@DQa }Ә 檔Dr,T:AZ\~mߛ4pZK;!l @JlKcK.>۵+?~r񥑔a}oQ.2~IerJ+0we*j!J|/d,0RhAVr)gYr/@)D@->EK++y7sxE%PEؼ7#b1^gP t89Z߅"Hnykъ{$##~jh8\kUnWw3Tcj2.l*+c)ݕ-v_qT]5p&(FJRS$'JKRB%Z+:QB%f,8r_r_[TV~=v3\e SvG7r, r R~n*d`u{0QeDRrentJIq7dP)Z;h |Cr 5 (-/?L+hӱ?^Jԛyf㕾(E42SWD#)}Q`y˪穓%@EzE ::l>QZ9Dbf?_ӬS4(QwgZyʕ@R+aj!Չe%WN%4RKTә$فP T|,J!ZE.qn{C5Sid )Jh/r@ԶD2G(Dʆ@!7[Ztlf#ZMT烈0,)%BaP! 
:-C; hlam\$C Tk r WY:kE{340"c1Dƅ'uLp/8h )ưf*^=ŤrGVlb WF`J K%֑@.t'yXJI , Z,!C,#RneŨB_*Mlo9XJnpX:!/\v˻~Mz*,SZVc(p24v\>wX]`K+qWMM}ϐY9PE1E~^"(>wFA}6s7Pwof8:Oc>~~Yn~x7_/71Ƭ׻WoIɌݻ/{g|چ-ޯU >iDF*0U+!/MfgL㥝 ZMetsܴ(8Y0D t:bHc0jHxFQ@h#E* YB?c y PG :T*4j@S]R3 X0M9=^'lRp:&2刌@Uv&i8hVz‰wVAi1;#Qge0U2%hXIp*8) >-o.f.js8d3A !\KՐ}u=Re ~>8G Ĩa$; {&uQB r: ~ lB:c8GR%n \TuUQ$ p_VF[t .B QG&tK\em AHVM+8E{D%YJz- -4b&Dv#9m"Zc[Ƹ\Ht oMI@j䛾k䛾by̙rz&:- :Tg1/ 7rc |A9S&-͞mzM$ęxL"qGdM,4pJIVYJYB]С~:Jv5Tl4@5Hpf bFv'xiE/ka#adU=JD6|sykivmK ;O+s9ap=L5?VwrRXT;0D2ALW.IYhnJJW㹘=ĻT߿Jefϳr7rPT,kuo ou1OdPC~&aY+2]Y[ q9 !Jڕ5"B4Q$bD4CO) ީ56AyE|DNT~wKZ=o>knm;⃃nrT Ϧ.8陖VZ0 Ȯ ~9N;?5ؙƚH (s(BULQUV?sn2mjǚS^9<lLMǑ#e+g9$S/9:c%j7_ "0JizC9%?FQY筪@o"Lzk 'ֈ2鍩1 9&>}tJ-E; ~{e@k .5J{+y-,3MoL_Js[wR+%e^0#1m/iPbsIczK6ӥJj6rMc֢ٱ5d4v1PX.hsMcF%12(?I^zgc}(NLvyp@b/EIJ{ūxEJXH|_L$|:dG@BBV1YûJ @$^Ǫ'/,`Sp  6n\2Xۅ[HiocGYqp G;a1 1{\ 2TBH0 h)/mov>Ö} _yG(';{YmLK9 87E[L&+f ,FctM$gXs7x"5e\ D1\*1zzA4_'<)Oڝk1$*!v6L<WyBDXE 9Rߜ+ &b1$DK"*7C*M㄃ > 0$} a*JLp|z7o1!wG X\jQ-&$!@C|3cマy!x]û樂#@ɤ:ևz'jAxv r' 'L$^__|)Wg E/JS|T2P; .ߤ_A,w&T؄#n~ C鞩z Ò^#:(4C.hDquAAߎ0.TgpWopQ /8zv*\!aKC/c?M5cܹ[F0R;{ -:ApP*6A>v\JNmtE5t(1M`F™Pg b@`tUQ7JWL{gƑ`(&[DIyhzgk2{Ђ q2dxDK5?_9\@x-=ڟ݁ʎ1vv>e~1$(<a(@7ϵхSBzH)W$u(bHx ,Ͷ}?[h"5w."GV1 ު5 4SONacP쾯O3>fwf|3=Yϊ*s ˥4Vf InҫXy9\_3oj*' KAr1 F!fCZP4#X}ѵ74@Zb;T7Kf%Se.2y3ihs} ir$UөD#gtѫьKQ!U5m|B°g\QH: 7\F\yziq b"#ua-yt9;! 0[SkkTB&T砘\) _B :4:П]i;YK^?| Bçx5S 5(!qVBT;`$X-.;Bo*3c n2j#7moyw[G?U-Z ݬh74n*POA .ڬ}lnoI9Ҕ-"U4& (^UrDs+Q1Vaer$J4ԅI2ȋAbzp15bi#d`7 SzG7O@vlUciBg8e c1b1wP&˹aQR<76Ȥns"s"8%:eůgU~8-]:4 |^:{)8{g9h TC(^u7ABc^sKAsӆMB@G;\_d.)=rDR7V>ThWғ-_|!z/FcS23dnoTŀ|s\tF/*qt 9l8t4%ӳYJ$f% ] LkhvAӣd|︇}4ږ`~f*WE "[_"~p$Vz&A_s^6ڪcg~:gmCț?~ş;'k H”!y#l7F'a0!(/H{8̲jpt$5)q*iq''%P3QUbLdRjAȅ0YKgq-1oܛ͵ J׮:[`+ЄJzm2n3əVԻzȢToCd(ɯ]}P^(=+[>TYwѭr,nv&l{Z*D׼G u1+f?z'd`J 8ܪ!s޿-|_{MS~ aޭ5[K[n{.Ͷwt[SLhҳzˆA8ZXA0nLY )l!gr 1=R~Qq>av"RO2МkCnpMbjr!I;]L=9(t u4b+'cFE|pd.6bh(2Fc&}R3t(NӒ;, i!ːBR4\AĴ1J/8w@=d$|أS &9 F~V$ NLdB:gsgD3 ]#%BSY?OF mB#F@(r9m71![H̢_ &x{dOVKd45;?kGRv4 wCBCo³-AKI<(ļC kY)ZPS(LcV0niH9S/բOj}btduxIbQ(h4;_30Q!^BJu%Ĵ\&T#Eƽ9%¯+եČprTWA5=/bB7HT; 4W߯͏tN K۴0kRp)SMڴBR{;5%$vjQXoܻF#,*tYJ5,K􁐶>8ﳨՇu-YOpsq+J6w\3ɐrd]ƬS uDP,xW.`ٛ c6~=|C/m5, W r.bwy8<4}D,3Yŧ n_ cS[]$fo^ |'p%9.w:a8gKye* sU bb 3Smg 5"1M48$mȥ3_2\0WMS,e4aq{mR*l6uwiqT{-f׃O.6\NK-6\>+B[).Xk|֝hS;M?֪-%0١]R<9>فrY,.£uNV17}I"Kжk;C$o(fvSNbX8 φ9YO ,")d}!=5emq:~fWF(Y|BCeɭI0H!A#< # L{1Js$ڜg0>TNSByFx3KX*s@ЁI R~/߾m`l:1VYj'J~d?07$f5Jx&`CaZU?ZfMѽ %{|o_܄KI_rPO]*D)**2˽ ހVSh4D]١BEr/ӄ Kw%aTV횔MۻX6btr;ev"@W3RElHיuH]OÔalMX-J[pvBҠjܓXB.QDD꼦 @̷; fxdz|EY3l9O@C<OU7a79[K|u>["_]$ՖAm4$ʾ~h-*%gT&pZb5>ށQ$ZXE{XF%D!Ḱ-jqRcWH/v2p_XIlŒz?6Ti Έap/4GǦi޹n24c%ڤi;-= oGQ\hu9PtGؓd1Yv gF{=f;jH,DL"޺$TTb%$\L<-uH2f^'(5:"Yiz z{ijS]x+\gW4x0Y5#*T""@2 S-Gw&'uX_~~E].$ǒLk hֺ,]94?qۆc8;c%J֐jt\BbpBPK.m޽XT9)8*Rqpa/m;q0"eF.QSJG.~ue^(DJY2!=je|KT"V%A`Y7q@Sdgo >7ce{)mR/%G÷ loPp$9Ϡ2#L+2α K푋6+|&{WS`I(%,k_ w}*D4LH&xH!h_雙q"_S hytB}/!9mG.zWu/|nDurwesUS{.dkk$DNssr0s$C-!T %ռga4R^,4;m$?lT#c[W`"iuo Hj= ́ذ*+< RBJ 8d.kT M@CBAsbephcTv؉:Se"0KrwCiyLbÌ{3iЮP|%x$ʝ(;,*)5<~,/z#FRws0s,%3p1HIN^5WoJTnٸ7tvțIJW٣3h=`de LY Z}و ƥ`")Ne>$sO 8.jYܵOi"wuh;Uh.uaz^gpF{2ŖIlcā?KT;0yEsύ?<{>wt$K~{>> \D ]*;I뫫~w7gIx41_@wwO%Csw>ųtʴ˿X9zm]/$?X8Nz+]B?;3ɛ^C10?~gR k;7^?t_@T[+U4Nvz[ݟ.|]/EOoߟ`h?]@ϯ6oF=('\^Eɷ롕'Nk7?2¯`gB#H/VI ^ou Q/G{_8PcP!,A uo„.7gWGoJ|>> FWoe7t{I?_˾{G :'tb0+Ndw$\^ד"rIERH{sؿ8!c#>q@6:|0nW ܔg3$uN*Dٔ_e04Evk eZp2(o<_8fv2zОG.'}^'"3G=cnd/RU)o!*2Gl9n!@T Ȇy>+> 0z|kBe?'nLJu[a;5H;%Z&mw1% z垬]YiMz&[}FpJqEgwb*s]MA)!5%@J/}*znjl= u<}4uae'11HbF=5&te$˓v ?E 4hf =A 4Ch5nwnb(˽X3=:!zv8 XJݛؚVf89GОֲXֽ[BQw꘶[K{v蜢R/2*$:+v9{ R6ֹjx[VgG5ӭ)9Nv \{;G)5Eqȴ=Z& xVVT@t 
>T"V}7-JקĦT53ǥh5E]lnTxup\lzݏqIQ̤~ZAM㋢ ^-_6 Cr/llc)dSMCf!֒mM̻F7}k/[㙣= 6ӺɎUt:U*Yajւ;r;gj` qy  + 1eO6v$!b$$saDAF3ʄHFBTR.|)V6*fbޗ5 cAUZQ^.Y!0WhM:)*HF`%!EFhN|p! SZ~UK+,1| AA!>26nTvE+p%c@BF1Vy%*[Q# d% "D<5{BŴ $Si` iRCR nB ȵȝ<zԗKRf Zsvc?:0l 'i4 Շ exՇ"|)]eA{{IIQVMr,ݥKݤ!) +yh/_{]'N'{wv9UU4*!L^m6(G{(Owq;z4;ʹħT;|,'Jt7:_"*eO!?GLK$mK4WPm"ٹܘ3ħ)56i"p"mśzVaWOF1pncWOETIFT"FJhNKЋs,2[81N tZIn3.pӲI)CzZ#ޑ!XSb%߯;d]QԻgZa] i| Wqk4X/EiU3E5P\o#ɯ+EH~)>V`Źn˜. l%HH"H'GfIj|{*S.BCLFmb\.a`|82jZZpr7^"ߊ` 5OBp%jOkŨ+vNs}hB(fp9r{Mi* bpN_;qvNK4;riNiMHՈѺ=m*'Pu<yIyG\32\A,Y&:>II^zX7z^4mwt]ZQ.F $\ Mt/(η{-#`kzSm6l7P&i3r\[ gE*@49?313ˏ?b9 YθL\OƋ%3KJQ.WNU( :72BgA/77U(0Lqb-S>n 2!Uե߫ؤSTUr%TA(-c/E~g;?gY_?6T5HVN܃1x뙵?,C#[;r#[;r͏@8̌Fp=O!0 8s%: WcG=uw[.m)¨ehU}Qu-dd#m>c^dz/_QREQsۻ +Jw_ ۺ)#q*sW*W| V^MBvñ堝{1+5pmpkq̐GnD4c3V.T|2'57mG 从R:Yxu;pM֎PƤgcR׍I_i:kQV!5^)BR̙Jv6P>l < QVԷ7#mZsm.ʵP9w^o4ђM ky2p<7UP U&\* e3 x:$y>E !R+i} L}mڳ4^# jŭ2UL ^x!,5 PdiC+1c06վO=< Rn `QcTv*~T @P9i&[?܎G C1ݵ8wit+ue,HmTGBCV15\^<(|/lo n}j"δL^[a4Ǚy3pc$[w) DVύǿD(y`g^|O8dȾ,[U8@ii#b|%ɾҏ|wCQ`R-K* [%ꦶx W&>.ůe}{ʊp%s>"h~z(sXprۛ : 5w{ g.Q2q7vOowޕȑRЋ^&q1|XF/5\~<=~O&8&\?V"Yaa(6Ia.~obހ 8W o$o/)Ͷ08[\Yvi?ʗLul-gj¡!{0YK@j7_١?@ʼbGO'aJdVVA;Y/N9HUw[o$SvoÌ!d<:`1sXqrGvrr .:vٕE۝ui[$ j܍w?6SZJl{HudZ2[ծ(+vk_ګ9킁vZ @3.=cE`0N0t=)‘y?GuQB*At=8 zs 5i3 ۚ>V6NM*u$ɉB.jwZY~HYk_% yK˧ BCBѰV_NRq<ZthT RH6q8b9euHF !*?VX EC>%aBZ?fa%Ēw2+U*'Uc8H@͏'LB /$WH*2TdH1ZHWqinó2Ff|u;a^{``Z /W鳶@_/.?{"-{gm{?lh+Tzݎ=d G|& 'o-eg$ߎ@]>h\t1q^S$ʜ5#eqY}<8%~]5lMT^Q1SmlawX,k++HYh zQ|&͞| g+FiߝkBR F 1>1a Xgȩm$5h.Sp|HM:!&1:ڌڐoU."'g%,FÃ8}02DM + \tw"W= \M7b4'CJ7ܷ?8kT'LZ80`0AL`7wpobe[j[7u乫[Jcnbr[ oٮy=% Dl\|R>UIR8ʼnНn> C!X# h0Pb%8ȡ@׾1'IƑtR1ޜ^)! a'j7Mt֜kFR4|-@2vl6_Nh츱"HR#bdHNNc,I tnh;b: cC W]]ϞH8kv Ɋew\5wYh6JUzSi|E%*d $-S.ldtAB'=L R{g ;t{sz:r };9Wbԫ?W}|C%[@4){J[-Ԥ4RIuٓ!t!&hսGcX?%!=Wy$P#3o]=F[G5_k4ZmvqG"[1``YM{~kXTE_8oڛ~3xYaЪAS岰ZDx֗ o OC_Q_^qƴ2-c$8V (b;S+MA&/Cm[ >Mw|Zf4gZxڿ#BU;:O bAf.̸Z㵇8Gߒ-B?hʈV;8ZF,hD[8eh H (h DbբB#;oiKp`7ЮhG+Ϗ]ՅwW}.ۂO/|4'V|`E~W|¿$'[ =.(nQWjZ[_3O9e]5hcrZV?Jnd7n_O?w}_J#6~i|^4ȁA.ceTI7fY>yw5/98)=ʁ)4AW]v|+-U(QI>j*=1.&\}v=$" $H E'-ھif+D;"5Ҫ4N3c-6gwuV24z0?{o!X=j81 d%m`̮PwhYؿ+s4#i4Ag+7 %4VDA"kZmF񎆪;L~^b?J{K/Vr.,U{?JdVɻTiGr.JB`DKfs4X  ,ǝCqc6:D%"$'ôqh sd%;ˊXVYt ,B1i#R C3;K2b4X Eԩ"'R ˺olpP `^ݻqx.jSu۟o~pϺx7EM[b) Gg\.ՇCXмt(?.]?CZ$ҏ֥ape(.Y G+:\RAKKII&I86l>H$b9˹Cʔ3,(`?ٳ(9) $)*5^gH,˩As_@?zv 2QRI3,nZvrHm:^B5h`1j*VG%#%S(R,1Nep2+IVj]d$C Z U+ǁة Ib#pmҏ@fXo WڥG.A׉힪}@~J%b.VCr:ƒH쁊R8:C\$ 42SnHg'-i/R6|yGuQGp#hʇlyJTnZe2RohOdL(YgUc@c/pj9av^ҾݗG$mvVվ)KM^:)Ly籇ǺcM!"?vW i_@8iKۊ ,W _iKAGʗ먓5)jmgE~"4NV{cɩlBqHPk;I2jg!`35败;.wϽqg\+%h` uKMSdˎS5Rʋ5C &a+ȧ Yz=h6r'%N;vO?/D!JGg5dg&^S2]6n>J>Ɣ#u;a՞Gz0iI]]E [_teӚi:Dz3.c%iKiB(?/rm,.o2kx-vڋ$̛C4Ỉ2B(j@͡tYe!48 'LNVkOÂ:PI׃mV9Dr:ebH^DYBLVh fE@!Ԏ'z]ǢӉY"$7a0mMDA^Ҳ o s4J’JBÀ١`?VD5yg3}&lQ`W XB\$$LVP8j̎[cK8FpWk`\jjGNR'/J-qҸiX==ٖ;ӧKŖ.jt{!lUmO}G\|/~e\Mch6 rN*g'73{ػ߁'dvbtgz㸕_`Հt$5V[4`ŭn :3 Ò%N7b]Ȫ"crjlXՆmB/Mٷ\\^ꚗy|9?AV_6wOnZ=%֑3A:L>ekW՘}z>x&lqk:;氏({p_\|c_>xD<l^<8ʸՄ, Ͷ1ϖPr0{2#FY)`D baE '@h.ۚZWOJs'2rT['i%e* T4emnąWQWv;r`w~Kd[O>z88*vp5Nn;FكɹW giu0ZQ%r=]*>aSW68朱cx0z~7jMU;L\FjN7AwMO+΃B: Clef߁ G.&>J3_h1;#?lK!"Q`MXrǗy$'^Pt#)_L8DQ y\@(&ٮ`V|aHQEI9Nvm0TRĝQT >hXO²XX4YOO@SݏӘ?*st r1GG74OcE=khg|EFy5Xvs2:8N { $zM ˬjYB{k2HP+ԛCRKw|t[+yF+>Cș&A -[< 9C,B0"M*aj~sϕuh}wpPETճOZ8!i%]-: gɃ1*1記KRrP$2D|(' im0A)^i&>|E DG5UVqQ:p+}uD P ˆws͌ ʆtO1E'HsY( D:@&=.{A 8Ӛ'`_;&eh$xϣ x% "EwG&6R D1DDͬS`g!'(BZĊaV]%n߲%Ϩ gdѽ Cژng=px?Y'>M W{q۔$ϝO[ʮz-=`ǘ>vϨb/[RK.b<=_:8sf¯{ Z1l)<{ څ\L;o SW+-(E9A{Y Kǀ8J3@k:Bb !1+ e^ēqCJ }Q)8%!kv PCF%p3=mt<[I/EAEf?mГjPг7켅~"wTXn_NWJDP<&([+4BeߡBj f) Q^oPR((0]pZA%41sTd u!`ѮhP(uF8癓h,DJ- 0m̱&nm %QـKˀ h-,*9iJ*H^УРQ' hl2EM9_2MJF-$u 
B%&T4P$$*TࠨUkDWi7:!qBǼs{鿘ɏ7q,p̳nIDEjD)g#CXIT43$d*YQ\9@ V&Et{V<ьc+s̳iot3@ .2w23 F- Nܷ na"4pFY& < +_#:yPLM YAis&)Gw%ĤIR&1Bթ НS -c4DI, l;]c` s4' LZ M%kUCZ1\5 @mAبTWtcݬJ\!IU@ue5eMJ#D01Sa (Ni7UFP_Gt]& ǠyiݻӍǙ볊Sp?5ox!%j_~sjVӇ {r[I?xquQ?&r:kYMҷ'̢]LJs! ?\ċ5w®g^^;) ,0&nuuw߻ 2F]+/N}.`) xSKǛ.3*Z.^SصWř]T%:zCdK-, 8ZY,:oO41YTֲekiO4H1YTv,l D6EegK&=> j`1Uv|ǮqmTY|쪵3AvruWc xǾ0`]tU;lL쬨ک`Lj:4a_@;Cq5&fԵBT>NaY3]L>7jum3p0!D*|V"q^<=uIkKe~s'mMSg-y#~Y|38O>?Jb׋tsurx`e''yggVwesXs9K훫s7/Vݭr: $d)uDVR#GV͇?VzuX;같d]fDy:pG(Eg\Cbg!w~d^<=虱{tf<7S0f $ \N\`zW;YݮW#MxyNɔȿ1G^mA'>:^]|Mć7W;/iOk\Xonnj/zsVV2'^}՗VM!>j;@GFw_WOEBmk_=g``+Fr_O]1jke_j~̱Ƈ{ 7dQi~i>]||mLmoG~ȬY!CO7gWKrZC MS0R2 ^Gʄ Vp'),:~C~xssUr0C_f>p Yn? X %;cn?.|/Y45'44߽wwT a-Y%O +jx.E |h "L#M<Ա PA]F+ic2,Y Ac#4xQ7@`][aBXoy/ G)!&+gNs]TJ(ֺ#PñF 6k@)P$GQGM4]rb,*)BG$ 뢹׋1p|ʥ2Mu@=]}]u ujTWϺ-;a#%ԦCtk}YW"<t\\R0p:ߞٛ3w?}xwQ4k3v8YT_ߞZPQrUhژrcTUSQ$ ,KJ>c-cN(8ϣ;Z)^<{O]+B%g 4эڦQe<1lTz_" %?]2Bɡ8vӗ>`?Fm{!KP%^}~х| 1T eBٮ폻.7^@˖w|l3tV2|릳gZ6".y?S-ۋmw-yjmٕ4ɢJLɢ4IMcIߙ99j}65 _ޱ/^i(=*$[ւ31yӻ4c{;T${[{5-Ew%9uےU(Ә>ށ(喷[hG+QbRf(6fYō@FLSa3/5<^Lgќ UnӖ qLuP/l&9rBRvIb*Sĕ'ړ_OL"8HgLY+^1sxh &i\xt_ vQe2{b_z'5z{vWcO?tJI/ouCWWN\ilTeR69^PHX FZq _}DRbǍnOE%Hpֱg|2Ds#\hc*df=\J:Ew^}<}+WT yW&eSAu[8~M>83 *|WlZ`[/1R!~zS5\шֆiILPM7$錒5=4mö-2MwBU)_[aB!vwjet/[SHGcHjp{O="4k\o4iN KЬq:DngrԲIxO&xOo5rb4݋`;чL4ո'BWR؍Z$ A$ykQlקYiB>R4 {aHr˺1p;n~XV$~jI n#9Ǒ4@e ;Jxb6!]]ʙ&c4>b0w- 6t68r}7gf~Ƒ͇#.1Rҝn‰~s)ȸ -jf_|NJ1zBfҮyNHZrǢB lOD@48Ƹp&ρ %5msn`9met߳HNV!.t4Sr_lZƷvrV ё{"V ScQ u)k.꽰k@ %t29}+wMPg GP9Is0I#]g'`3r6/CiFpCתk^(ˢ4c2qpx;d'/(Y{pU4)BLNd}oG#8%Ospo/$q3:6h͑ɐLD3?<%ǀ֙ + %{J$LQ0Ƹ谷 Luac _YAX]S}L{!$f cu[$>J|ӂIYei<(A"Xcv]} tv";}]NpQ,깹 ~xbVXc00)ȿY`e'cǓ ^w4GP`:+L3``BYt@q%!`u3 (`B#c3ֈH0e ,Ow>C PMu.fj`?~7uSJ@EZ |`^fM%mI }i,tP܎TÉfDVnW>+&rzktD7rOMV7~(`0,x dљe{9vVت#>9OgVȬHj[#8i3+6B+k- eac&`@mUP vD<Wq8:]m.} *Y:,΄^wT+J#t@1LIi=bG*'.pBmM΂ VrD뿇>%>*~DZF/bSFK5dR7=i"sڕT2)LK*n{DŽ:͜ Ā* aGp)9\UdkuNdN,/ri[0 nSGRCs'i ΪLlpn|`R4{ M$gS ớ+no015]i؝Eoާ']xn2M/YP#ʘ+Ӱgs{ӨyΣ'^O&ȝs;QGW&2tqq띏MD Ph/D5x1C]Ql;Q~yVB"GK~@Ee[r ,jPSV24+K~y5?pA_a.BSPZ(ҁr$aM^X`DOڈ5DCJyLf{ GV۪Gi}iYO<?@Nh eIo tTj ZU0Ifl"܄>" ҋ5mUhSܮR͒.5, ,")fmJY%J%{V!UxaPS2Zcc]`Bo\jQ9'-FڛfdIGoF?<\'قG˧}3279Jtg0&!wwnfi%:(k|X&ctț&w~aOs5>|_ÏOAG||gq]lE=d,=`$}[n[;dyGE;vSя?gt:Ƨu]o;:6? 
4f,y3ov đCG)zءU$ot?!@2";Y#L  3DDY#90,P~ssEnRu6,z2F|@j1!~ZCVEQ"S 9SHtj)DM ﵑ`ć2ýO68D-t$L92nt ,Ja+I锴T- Rb'%F3#<X<oEQǰ"bI?jF% +݇!{5@,x̮矮|x.:Uy 柢U3).j[SJr\q<_-&% [Xlv@(X2|`!T9e1 >|xE # /< "C${.񒣿!5__x:?nnƓw*F])~OƳ=d B[.I>|Nn -z)R"/F>0n>RrP4?F 1R`HQX/Śu(,̀W`ijAn#@sVn3Q.pu5RO`^Cu `͗]4VA8Q`Wc7$f^j8DhAY QxBN "rmZlV`D08ifGU;!^Il b#%t/p>p6P  eQk%l6A&fQuГXcͷu޷6H Bȴ 4sR@ )R`+'wI‘b|`V|]1Vd 5eQ!6ll-hAx}AizF"%^i!S5"ZPNV8A1\,23)M3Q҃Hhpb"q8lAɴP,N"L-190 r7 ݀,wZ]ùjQܤM')KS˼ee22}Y2]P#Jp(ؗys'B\iZFkG·ZѠAy'{|u]g2.y˼}e޾ؾ?A lK$(+ i% u?\)) L;@RH;[ _mY?=2O4m2uN:jdI%n,HYu@Z3X@r,X`, Ϭքe@"m <(Q-@c&0Ŝ "rq\qCǵ#ı#$Jѽ4[޺rt;ܺLrG_UnAZ u[n[ jn -kd|,ukf!9bΉVi{vz.7C8/=e>®;s1!\B }9e:iozoJޝj \FI\TN鹌.etwlhi mJ6ݏҽ(Q^½5Kn9(@zJ >۲Fx{q6 #{ۣZkpp-[6"by3VE6D<u}َӂd!wp_}~/8AvM] a]߆'i2bI,g٭uZ "gƱT*R L8* &Wr*0+ c6^D?@TRw2 *"Iq},t)ͳ7r!qZOI%̤.,U <ˊ,++l+7SFCྈCh':gl/c 2"QssH QO)B74.U6Azf2"k zNvkUd;ҫv%T MȊ]A_rT;2Gq3%}9=<`.9읶PISF́eYWklb =EBFL)O 1aE3j)dRjr:qCrJhaoaCQmc8clq@nVbO8`0rvW[e[B*C3_q0M؛t'\+a04Y1M B !ABGiXDUhV{{LK}.:6s("pEܧ%;8R]1F{*uA|ۼm3d|+]<{o&^Xn/9BZ$;[1F XUBICӠZy$b̃C Pha)_0 jMotD@THd3Ԁ|brQh{ȑR.Gƨmfx9./2ٚ}TQnUajNٸUoﲥ JoݙƘg$0DP򐅂4B4a," 1$&RJS0E1~5xbym]9T1C#ʦ r\%wh'`N@5fӴ"50\ZkJ.()_^S*4Cx@~fl9R 6ǷkvZk^)]+NW&;kw  ź2Z,[/n_;Wj1Z~-Q0"m\+duTR0.n6{5{4h,d*ڣ@;1mj 14҂֬hGu!ʼMKdg"]/-&;Wc:ږښeIыg)&=ox~vVmMV * mtj4R.8GB5tte 4E%)&7 ߷&ʣQ߅+mUXICۉBBԩquu%r$U 2L\a1Um)\5jlh{5"biLQ>C?Β۾6 i̡*E1hJ +?1Ld؇[."McGJPIř}HgEdy(P,DT]^ETwԖXIeM+x\*)c N:YwF);ՔJ뤒ųJUR&,KA/ڝjBnR4(Aq㝭'gN-|ٚ0?t_X ' x4 QwᨃQ`E]9f;h|BIj @¨70 V+[% !rݐnS3&w>&=3mptp׊شڸ|\i*FH> th5!mB5քdfM}2+Nk蛮[#];Gd1YG_tKm(҃ 5k˦:C: ymRP$ SE]Kt& ?_c03~?KgoAΧ ĸ֝ ݾ!MlܢV:q2 k'g'.n"E~HYEB*d>;֋[RA`+sm0M]x(6QF}8;Z]>[.ÛY(=(E@e(2T`P3aciZ oL3ta3~;]'!L>@<8eD Sw}H蟼f}Mѕ9(AfAM)Ub/_AvQCq&p- EM5-0mhQNE CY= 0`)*6*Az)]hZwчltnY@\D~LCaAy/d O=js1^vj$C`F0Ґl/E-O/"4m7E,BX܎_%\Գ*L"g<[%jyʠ͛$,tƑ@#{+,jNznۋ8N@vh5m*k\3˫WWug!s\7Hcrnw1G9=G)v&N23D e$ 0|j!c< gIVr* a(thLS4>62"Ծ0!HPOb[Q}Ilf2Ncx=2:a){me2GX~FfL:)JjGTXp__ܧc q:s))#zexnS#a8{c.$t/4&, KPhbc6ax9\*dɹ2-XMy(@n" ჿXYT؛cyBrTV2ZLx-=ͤο5yb^ўli\$'( B輰$(V {8]y?y毱!+w?ωèC|(0g34{fn9'I*! ;ss@`zkW#+Sp2L_1°/0nWRǥ) ^O\a*ӾI0SRvϽA4%C[A7Er4LI?cpd_À?/O`3 ufgĮ][iͽͿ/wj2*6 Wz6dش\Vk=SwSͽ'l1_y+HI/xAa$  6wީÓf N7] ęy#DMcAN:9uЁI0Ano'E<[ ̈́ˊs ]=#VHxuPwhz$15P!7|YrKgQ~0Tt~/;\gJ{8_! U߇}Hd1@$'_l,iqI2V=Ùᑥl.aOS}TUw=-o[V͕hѵgmۂ6gumx 斵u@5/8bASZ?&BY nRgPw6a8ӯ^IsL#bpUê~UןL3uϿK?>9x ;s mmwf]}; 0y߿|>^˙?:i5_ߥ =}VxMv L]4-w~[ШguTsx5q~^YgԽI/M!&7WFF9IdxędAn`-z5EHrkp@R".Ux߯sBecs+Im8glv/Nl|v)DGKQ8?TP-8g9V痎PCoOG8;1" mgVA|C‡‡Wώ)&S,tqCS1:$ F)Ur|tTm ` vӏ18L,,vs%[zVDJ%npw3H 7 ` F9S+ZS*aH+P%P]U+'B9۠GLFW@Vs5ӴuQ 64Q?dcUB5!yba'I+ȆVwmXJNnT><.~7=sU_Q&,f:5eu&ʶ.mS4:]"wm0ՔmNxGim n_/EKRyxȩ`pMYTE,Y-Ir*8wQ32z{c ҟY?Of+믵eQQgS)د6^qz)˾?M[} 퀝izJ~m(ϭx QL uWa_/dK'$P^=0_͙pΥS[8 7\ZjC\ JObrbvԇ2 3BIv>n.Wg}޶ƇF޿G ձcFqG˄=rw0ouƭ*+"VX]a2Q,0ifZ|dWY岨|5)]2WZ:Rfc.JоsݰǶSE ]_SC3Dgd؅a6) 1K dp1MyQ!%Qg˅*ccINg#h լwx%}++kB1W>a]xj8vFG̓frU!i ^@ NrՄB9D:Lp~A{wzم-)40G>魓6*TU"jrO'+mM`8s*%͈Dk{*w5A xɐ#iWc0H›ފpiM,hL+~\IdTvz^%MBO=XS`̔j8}Gc@&{SQ&θ74c +-#TL9ekM5-JlJ夑cv98=RvG]^ѧ۷SZ!l*H~}lf_7AV۽L]HNdcAT^ nqaXƧ t%>ćx>Fgݾ3ɧݮT#wm&u+Ou0vٽqm,$:4na#,ޱ!GCi2^(IiL1,AT M攙hQoi]9vpZ9fŗ+!V֥82_*UMŸRUZ2/QKp)6c L:*#a鈪ʘ) 2Gڤ\fp]L l0Ƹ8J ^(etPpns~S"h&[Ը2E$[#to4>Y)Y.<Tj%W`G1ZӈE #Z$tac/L=o}SXHR#:D-gZD F1XF8F:N3Z Q(5W9WkRbgƂ %ٖ{VE-t_[ W7y" |!;V҉EP] t#f썳[6xEޖw SƎrI{g5ԧd5{Jr&$۷K}0{w$0٠{~Mf#r _޶~V&mMCxTnoM['# aS%0HnDtjh .aɘUzFT{o8,J$@R3"?CQxp(FAZi::W.;\8RfV??7>G٨ xN2֜f)۔6ʚJ"Js #gŕkO3Cq|]rP"ƣd&2 }|)"|de`kEqYcpho)\b~߯s*1I*! 
C5(1~?՚|> Etӏ1 azgvzdLZT*d -F\YtXdSz$f ynwSo*H(a%&XA ^Ya 9b}* & DkE^0Ș#F6^Dy?xqO(2DŃb_޿ik8ϔfʽ;'j,[/)÷+5^g^blDZmO[\XM?Cm^RcBڣD>.7:wSWU8<%io5XBWnZum2\Ao-e=vw#ΰ"ex s](V(@2 xS5B6Q"6+`Vΐ^=GH*04FLzXCy,a+CƸY79(A4 9掣D9B=gQ!`fZ^ Iư?_TqxxYMa4vVd|V6NFO>ȔߟǚwK]?gNJ"ͶQ c.x4CWЭ#45s&uZ-Q9D<98o`jh A eM!4p?MX`fIVay-`a;0+NkQI2`B"Ј}}+0Kg T)%9A{ttZ3Qa+e.2c\˸+o-;/"<%Sx8D 0BBoJtm< 2R&Z3;SUn0yA{w (.lN_2*U RǕIBB/m.rNv_թ2KZenʻ1 K+|4SM|p:8kO"Iw>DX v;N9En^f}UdHZ{ﶊiṸuT rqom2$N THȩCާY `*Ie!-8 ːk&`șll [VJ%ВLD3 ˠf!n"m#)A[ӿ~uܔESci)Y"Pk LMj ŠULr|eS#7QVt jt^[$.yY;~';$}\!)nwS,^x:l۷E^a H)"]h4`ݺw@)p gA'#V&uyכJb݆9xnLg!Aʹ+Dq d*Rj LS) VV`;ɄR)@j+HҜ dtٙt G*Oܼu$I8u]eܲVKo=b /u>AqM1{\nm'bs0VIQ& 9 R$fXشrC axnmFt2X(8uBM{?L"nv]$t[Pm&A/q̗+E Iqǜ>~ $V%ZL$^HM&@u% L(9­~,<+Iwr_*\ |z851e̕ڣZ@{wjQJYoz9 .Z#ƔgFۉp=T gHCzL|TW^;fl =gٴ1XPٰ0P٠Beo=(nb@l=2q |Yؒ\€A\86[У[ OWcO>*;Q$}3zcH׿_|2 cͰC!ohߌjW30Ӽc<ζ0$0bb\wGRg@f:Us=(ͯuDZ1au:^HQoGo}1LLK| h 19g]*zsqaaט3(o~?LW:Wܮng?>䦭޶hp ;.t^ KM|N!XXAc+A>*c} e"Xl5ORY ŷ :/|IyG_ }Y(.ҟq;{5bk^<ݛ&Ћݜ%ڴ;zSZ)1vc)YN*c&>^ 'jTExV)]f޼vJv㓺jk RkI[B:{YK l>n8wC.S\5u67RՀ6fsHաtJ~T*ץ캔u8mw߾|wZBZ+]γ_PX.G)ß!UF•Ba(`pm ΠLe,!+IkaV(` t:hNY#!2C 8}>O>wa@N\Y){T患GnǟV(<92 P%8 f%&L|?6C2p'2'CH*% vtE-% ]6}q뉳>:m]7-qTHȸP/H ^$'^{f!A-׍F身_WdE*SvcnpNMhcypmջVTrpX-[1-;o1%HW>/[S/[Bs-n=S1C0\:ERK" L⩵:#27 */fA+u<+]OiU2zIgݏ8b1S>x Pyz WWecH #*yrC><1*}Љ`B_N V:*q5d !@R5QҔ*mRTC̢ƐJ\ 5 %~d{cĔ_w}L5A 7|Ǩ!'|$ue>"B+OYPȐ =Q&>V9 N"&ɂV;9s6z9 fsi&ɋ A7_0Sr#)`B"_I!r4zP CW6jsm] Gbt:"08ԓ@yO$\(" gt gW,_疫wY 5(]WɰWơty$B-Zs " q (D>@H 0 7 $>$tǧHۇ=xǒ:+KpwrH4PLj)BpD=Anƈ-Ƭ7l%36LaȐN4U'j"d7|n}|5jV^j>XOXڳAÃD#FFϺ_7rybp6_=<[p>>ӃXoFW>Q 8jZQ֩VCP'K֗RD""鶤r4<A- C;id*R% j.04gHzЖkra eAFϧu☡^c2ȁ@ Z-ꂥ Pnu|p;k'{L~!u|7wnt?=KëbfU$篥+|;_Ηx~[ ,,̀LC`Unʃ) NE&X 20ݷ}K🗳&2[#_r4?u֓ffoVwVlK87~DkSsۘHc-n-Ag n}~`TC(:`-b3U]dzݷm1T5OmŌ#^ou2zgS+ Gy^/XOS~b}~pOjPCI/4RI %MQUMqKyË>jHSs;^ڶ\H&s>5SbPr K1Jh%NLkgX AѓUVP&lK$r]/[Bfb2e,BnK@gP+L3N+i{ J ~nDy/JP He|6=K5 &P~{0]G)zD t]z WXH 4#k9i2`hFkHpXÙ`A812E?c%ý4h ߁tmBCSVp 4b30+}Jd1nwt-VKd̍LJ;I&]FFR,RMTR"q&>R&%I@-=y @v[puQrPj7Ԧ-G53 Җ)A[-HfVƧ:*Hx_חC9Jܟ |~I euV`DB-;@7;I&QQ5ԘJv2#10B m &ȤnXg@*(qeyFDK7~^/+m$I^GՀƳa3>B-c#)J*X tK,fF}qdDfFD :le9]zEz/]5!tuF+N7uRFS2Պ71Etr/E`N>"Cz?vy|L6pMҟmG\*\ @+I=ȵ4+㧓KR;ǴЇAgG`9ǙQ<h$E1@9{|vAW"M^+Ɣ4*= [MI9)=s]n:N_K?Ңr̯rq'\Sapqq)|bt0_1jLTO Ѐ(5'=ڊ cI~r2[Wׂ݉azr);%fFL(߾*9-sۃ._9qpqsǧˎF΋.mHF'њqTz!9W]ǧ-6\rR>* w2]n8%]C }uҫIW7vk/.0cte#靈}];%1*I1[o@]b˜vɊe(Giɑb1v2o hM7N@xNڧ\CCj[\[}3#ɇI4Pȥ/`)l~ s?s*b̼[y;A4NtΠw}j.=b ;'s' @hL :y]+攬bP *)Cse͞Dmd}+݈;dT>'IlP9q! X[4>yw̖я9*r`yKaT &KhjS嘤Rv'>NnpyeUɢe9#8 nZ<hyEIF0(U0K,AB&94kDp$f PIh# P=9\J'}W;݈@DUN.Ip@ꌨ$rPφj|.d2ߺkY;r#o1F@q/1ҫ N2ΜgҰ؂j }k|[~o߮?P`y0&ZB\hY,}Hp`AY. OTp PRO8{.J  rҞQf%Z7OosDrah oik!\[ 5S\=>N=-^Ϙ\vpqnҽe ^|z\ BdOE=ɔ0.*L ( ZʤAV|kE!+ENtsAit藫*sur?\ۧsB?/l(MnFw.ohV/13ϾwPk'B&>i{NnC#pS>W3SJtXbw:QbUpRcvkiaR拕D寮 ғ;*鼮Sh}% O9vT2n.=,u:!ñYGGi!仭;νVc U}\2g-"U9M3(X1BlP=6!=B^h> [u?4>6Jwv*}rucO4< 'OM)-z xݛ9@% ` Byڍ-''~^7β 8 ]'I|R y^i-^SX[Rt )zpSȜJW~+Ov4S> 0嚩([=9;u|*j_j0PI EI UdT~b9tDF'MRD$R ZREbUkZ)#mq-pJx0UϾV?/Dit&UgQۉ>~^8#`tpQ\̆Ο9GXd~5֓ȇ,a9&C|p;^s׿ sbH?)Y}`NlYYo-ΫQif8z.`Z6|iGk+yޭ؇x#9Aӌ"Fmvz,R1gCcf 'S9TUk%š*&)JU?{nǵ{&1}$?xb=Z{XrǠdCVچb8![* zZN10NZQG]-ewj,Ghӗ{󫋳/vz~=q˩IQܻ$juQpXc S|Uw~we6j٨]fvY7jaoM&Z2\0cS & LP#{)5IQn;ytNJt(B6n^Ë4/&Ԝf3"3~=٨lT4:v_/%BV 0j<~r17 $7sןhp-y!'f6K*fݪA J`j GDq_1%@'1;oho'8B ʃkm4QO 2~ ˝w\@J"!X QƢM R5 \,QQe9[A}׌c ,_<3_8 *_tr/UNHѐ[ j y6.qM;d] [uҕml!FZ62T似!U KT8£[dRDwq. 
$'r% AY$yp!Fsw1p]vM:xrX݅>\̏L "_"FG z phm>`0 TqDSu\9u< !{m.QMI4F {KIE|@Rs;Y ùTXq]sNl}}W (2GPakHB `B?o z>Yns[/ uwW8\yʕZOpsŁ׉ A!5nI=D6-nŕ]K|_؇ rQUry&@h R->̽N:CBGyJtJ(D5JSy0o3"'PW1,j옾^߇Rw[QASbi9&38Ac/Hr[B~\| O[OJ`N×&]CYp(Ś+?g wJqg?֘.{iڳDdLK% }GPEpQU*mUJ}ʹ`J57*9))hF?kX8nM&&50e+J&=3#9|s ,ɘ&SAsA$#^ʭF7DZ2nX.N-Jrɸ8ilEKna&Jqms1"uzR,^\Tx ,8'-&)'LS7kK Vt&<üO-jɝ1JٷFʮ?L:n[M5)Bq}/A8RFN\,Fe_|}o-›Rd~v*z pTMŚJ{/NHkүSi3gd@:>n_e#}ѮȹN9CǒB9vUcZ1X;9' "Z#!I7 dH3dd2*c9KTf;D5g9 aUT3(h*|89:E%9T(qHp )زܹ1B21-rI9QXnskYXӸmB TRV܀zFWRT3B[p̧:\" f-WcQ1GZ aҚ/wcs8z{O{~,۾SARf+1puUddWF:ۉdz/fs; [ ;wHYS`G%(MdkoЛ|6Ni +xJrEE3Gl0ҋdOq!8k10V%UW3uޖ݂1  G KY$|^^ >- |tI>q1rbo/OM{H#ByI" }Q P"$,{KEG(cc&~`zK!8'[`JxOSjc2%h,[wA˜߇NY\*$s"KpЄ͉׽&A =*W>U\+j;NQ=Nha>6c0dT " BsăH]i~iΘT`68-ǑU2#G ]vۭ*|eǩ[9Mw*ElC` ]Rn0` Qn4bcU0|%\U0+& .[ kJNsL(b^#=t:. )XͿX5;×ۻs'^T'AuN0D,g >-L; Qj,p/5/>v_v޼{0lj*C9! :*%rUɛ3DT6O$eB X>ͰyD~MBV>X<9!3^ێ!W s쏿~?_78S`Ǟ{{t3mM(S]kD(WJ>PkD} &J+XYK| p i۪2< @#ՏxAyS}/NU w.'l{m쫤\]Q^Ϻ[\"+,rd=_"pk+.sq۷:CZ Zp5Ѿ~>+Ɣ:C†" zNDICf|a, a:7RN>9HJGU2w>fWxsh4|ThB9A%U^R- `,"3DN9FSUBPǓo=?Ȉ}_’/UرKx_QhJsu/)9-'fWk `f5IN412ɕʘW2Kn3gStMr9T*ECe=j[ iD neWPKځ6WV [hOr>3ײ'Ѻ̔jTpajRwWeFȠCQHP(I^0 +V V 6 }VAW2UGq *p#ݮFsOel9C^[hWT(.kaa6b+n \?I0G!״919컹ת,f# 7Jj͛`mDZ3&Ja6hF͍^h1D;t( W3~=:(DI&i/0 nplɜr`>w~λ7X|፻t#6:%*r4dFT=v8՜J!θxV`8Px*VRD7!B *DQc {,U"HH*UI6qr}%4n.*jP:k-c:;`їEkT"(WK q|Xc2Z%;' LwRD|X܇T!TWEx/ٸ]+;AK`yL@"{Ii3W/ҤÎ܅j7pa&]Ifn҅ b{5 ,Q:ZM&3u!$?sfqN04| f %qשV䈗[39.$d|;XV$݉=5F/5E"? k<b,zdRWBk5mα<-}CBFf1ikAb}pv-߁0&҇`LBw$z<,R.0_10jxm.0k fh1.~isH8}zE^<X.ELf+](ބR`cJ|;jK$ gRKfTP9P&0:ڲ"BL*Õ3lg_!?E_Qz\^~g^ [8~F>Kw&\bvB Nj}?ZąJȗ8Z %}TQK 88 R瞧kܵ`UR[ӦRӵ֤+~:Luam+`ChW <_镽+tԴlo2^ugvz<дgeRm9+s*ɱ4{ %f4c|Ӗ-o3gg;l6@jj$-4:=䔤3e_07l<9`!B14:%dx 4$thkB_Hn4x1ZQ/H4pV3- )˚hSi)qzlv["elĜhʋ>Qq-`m3BT^-^qsPkTﯮ] F7[W#=#7%C%|%KqS[x`,T꣜ (<0P0ev5u.'1\Yl'}5\JKy_}?fvR;],FBV}'W7vcbn,7ydoꅼkB)8*^ N4l=r(9Zr҂̩́Hnb/9t{yȩ:Zcҙ`r(#\$U^ xVپeVO1-6gV.xMЍ^\@*Ô&QqsPE@npOIMxm2]?AjHS3ήxw`'7Cw2j].hd綵R;ήkpK^^7w7eOTݻϦ = hy7Q0gsv&RЗڛv}5:gf*YIuFxT4rǻP{V5*'*q4y"Axl|kF/v hCOfe&F å1 6~E{?T)!_#&Vcm<ү'+F+9#hcy$ W*Z!("l8Hl1أt}^˲fŇ?d@!+MAUrJ ZCd? aur3\.Σ<(M/WS>gA(a 0ǃA3ثp -,h&-o X.f[~6?JL4O5Z'wPHzㅧa*M˯΂XNh j% ꥂ ^G2?Syug%q̐ݥpukVakA ʋvaYց"*Gd1<*2KīK}OGu:q1ts5)Aݟ~ڛ{r!Z> w?ξYMǍ,zYjvnL%JhصH-.˃%E`ͦ-0}q19g?5o0198Jq](y{~csf_}s]퍗G#Y ۻñ̪( PL9չ:Aүt7f8}A.\v ynNܧrl#,H`J I%oq:(ZB9U^%̭ I]$ K*e67u4fKŝf`>Fj&7(JYˌ2 dfLdd9T0D8`REiq9.bd@S GRXMi6U=]*OrhXDȥ`eAS;Gqt\`HDK#N耈\hHQ Lx(;Sr&VtZNxO F_H E4p^=rHZp"' xõt94xf4KWQkS} I-28ͣQV$j (@0AEO/SKGInP G/4YCDٹIhOoZbWZ.֜h.vz}u$|6UxG8:{<OR gw"<ۤ5W.$rhr'KF'ׯZ. y\7pٺ<_;"vK$H))~v~uoR%t%ۑy]R ZϼnkIiơJ- XH!N!IZό:Kf3 c9\Ua];NH S+c]7W0,l_p1860H69F#lC-N`G-vWzŪٗ,L^}igou'|(kHrQꓐ%z@g 'I_˜;' TL+(rI,[POcx.KQV=.jp%&?1.LH,w1j.)r+E2\1#0سv ;WܿDU `Wy簌F׹2)p*_?χNFJ;`+L;sk";0՜wVDXS߂ DDoPtǪp|aRT_ک }Z`M5b?*ɢx+ b%7$NAu[s760zhh3q87?iO{R7r+'I^9 `I}QwtrIcLOx'3mV={?']UA䭘Q&oeGd igo&ǟG-|'nɩxﴇw4ڦn0Z?-G~bsx|Ҹ=r*1Q=2CVu\B~UW?\XFIQ \;&ϥD+d AR)GrGpz;lpm{. G7LVڏA MB+%'PV"l43;4+AxAaR٬{e*N(L+s#<&qk Ұ<w!8d<-UR)s~0?M362)-J1g #B+ˍ5Z,&98d@YqͱZYx:EpywL4%T Zr*նXZնXéK2OP3y."9y7~O&WԶFBS-1լr}plc 6Z6fDps{xx9Ճ|0Y !@,73շQ+%ջo*LUqi'M>eꓗ8ٷf7,jZʫL2p.'u`7r6 qSsKFY.oY5ͤ n3+αZ1xHv)+N@O;Y3gF <ׁOEZ#!Zn2ɐfȀ"ꂨE݊w~:@|~񽟭f_b'Xo$_h7=v7˅A`!$sp<QM|>]!lWJmYyAV+44x `İcL ` 1ZÒJK4,+E.U^O -ׂ{?%rCn(DyL1!@L*d@b$9RCet9+ 4'9A[ǝD1̹bvbh>œ?.ҵcq6K, VOr݀OOxu)ϟ!o^?}~ɂN*KYt5/o`Z"Q] &#s& !w#C=;!HC -0 t?x1Z&q`RP.j|4 ⭙&3 sMﳙekt JcJ&)'tr(!s8[CwՉL/}|&Qƥ} @T([pj$ˇqt^n+m++ }*f؛^!/8Eop;͖Q6p<8GL ;fɍfO0R j!2 ;H!/Ĕȫr{X3][2x5b 40DxL& Br.%̠h|Xē,pa`v3;ZDiq&8 C-666_d_V"yK UЮVjcԱIqE  ـb6? 
%cYWÓX[ĵzxN8dm}kNj*b 4Q*3ũ ;>R(5UJaq!˾9Q[k=BTO#(!daUkm/_Y05{ cZ3K)}`&qv{3˖4{GkD%uǨ E{>%x\eKx)t`YjY@ȻY/r%7 <k{Y+0/V9.įT1U@ Xu١<}S&B%"[SͬZ: f_ ;^"ŭ|9bY 0F Խe;[Xұ}pgqe*{;W_T"Q?/xXF|ZkwFq}V7~BܢݔizfsNc%jX3rcA$ZrL)DWK]$j>s xBy K1 ЅD\b4B2'NlIu|1%i]VY8ce@XpmxgI*q9L'3-Sľl O00Fett…"쐔V-=9"ɳѿuc-K)wjGIVRB 410[Zs!33 ~aTj  VP9j^?w^GA ^ރ LCDﲂ˳, X[AI_SYDxPhuZ$+iCt+}Q;34ȌOwz8@=eBAw Zi,ezCXw +TRn]!P)eELme?GN,ZU֤'_S࿆1nx7o&4Q5&ZvD=KU?8Pqga;@-8FfuMv;v7XK#* vG.*9mErGMJ㒫(jKܝ0_AaȵE0euL*ԯWt%Z&Fո]T|2쀙LEa-w>.eCx)0L lUlRˆNV8ئ SkuOn|zU5)^UyMz8֒i(a|OK<̖`ZڭŒnk%Wν*czG2cAavR[S^hr`-2I<8<\`_n)F8kǩucտt4!q%P)Onߟ^|̢?ϵf.Rwk5Z+WϽ 8 ©H7xsHgZ_@9(A1j4$ [2Mֳ1[D\:-irzam94?`}^7_y { io[(!Ŀђ_Wow"<"R0M@6 &54pk5Q8n+iMNIT4 [bq`hCiF #+]K\3߼߽P+XP%lQK/J)C)f_$1IZ8zk %bjً5"?NTlkhvUJW?xInăĕXLOwO? 5桫HYu38v2HynB/|Ia>iwAobrDz?|ƹ`TE/1~W,7i@X=РrЯJp6Nװzk*(2"`1X~M[a}Ùbvzs7r^}}/û/_߬~ /n_o~|cYh9۠~ׯViV+vj /^,lU47x|z} w7z}J*0+f80*JEBb_HAV^{S^isgנ~KpEy.gIsU|Q04 Nv >W~.&[4= MB3rK;610BK_Q}Zmu€h PFOvNROU!0f Y|&$*CQ87Kl<4٪h/wlIwl}yflhrZ]Niu9.Ӫ1!X2}Ih>k%G7{A9a,1pu'd8N XQ{ۇVmI& w?kT(Oڟ>N`;V3 *f1p# 6 yM<۞eqG X4q%9ghrwxsg~.i@suiΐj;bufҌˌߺf[OVMswp֡dxV[gv2whmys֓3r -R4X gΖ8 ' eV^ 6W^3,dLbܚ:SoOLa+os rj7yn"TưMpu~d1wϣRv>ti6É C(P z]`ŤSxL ^Ӫ1́CENq Ьs~:+QLD0ؖyA0&tPMCԗqhGm2JIֹ%1͉@0}1L(fzWc*<񤆊W5T>,n] eT47X1n1[BXZ D(b+U\G`*geTJF܊,Qe Ep2DFNfDyh,☩j]rΖiU]x%hfJj5)4LOPϸ p=(*i(8E`&, )[lwv! 6 )Ww=J 3績[$̯Gr<\hcVHC_TS`q\ ՜#JūW&eG ME0Z '. z 1EB.0Cp!5D dFjȋlj "Cڞ(}ruϫqBSp:='EPζpT#+OҖKzb԰4;j$R7Ӄ$V$lNB{\)0*?y^z@-xzgX5epԪs JKV"Gea)2iYfql#ae0Ē0Yi24qa60’ƆFVfg2s>zj듗ľ|Xrk6x=y(VқyF|6ܚdkv(}\TP"f tE[#9O ~6w4}R;Y ,|{S_+%f{ 3X=x_ƔJKs88ώ_bX &B9A4JJ4_RDJX?ofQiG'n5-){_oe% .JHJZ灲ӧAulL@LVI""! Ua,:X$q$4i6TJD$A\bnPTiA Wt6")CR4I&L'J@"1q-%ޘ0m%M(imH3-&>EB-QQb828ed qZRP[- 'QгTRq3c:4QB6 #p-%r%@3DF&ĺ$0+Ʊ 1@k cbB-X:2:!17 $r :&tٜp1* M}FCX#(xlG\U|߇Ilo0@дO?Y@^*8P ~nӅq2) ΐ+ ĭa/ ѽxpzAo?]}A)Y~_7_ROpû7}f,U?ÊrM>A SȤٷ˝;֭){ fD\IjA ok=TRY%CH0ҕ2{MVw>5$3\(Ք 4JSFW^).zKHs$G~ aeQ:D"Յ+ܪ#x沑"{2J/β`I< U!`4zlFUIex҈l\gl>_7o!XԊ`2F=Pq,0( H$aLG,`YaU*(g)c4cZK|HD|DIb9ja *lcg&ؘ v^" ku3`"`.dzKH8.\`Ah&E~^['oou(oO6 x)4TX$Lij8I8LblB+T"xb!2ؚ?|>h^f<')~*hhg~KOYN>sS /?zN嬮ꉧ.Tv9]!x$҇nA yp&蜈nj%\e~\.^A3M5w;e nrO Mk^ӅvZ)A_!Oɚ٫ogMo`/ajcYDӸX7W7滋w̸Moh#+pK)Y(RE=(iTPR6e'v4hQ C #CH5/lh9 mdS)F86#8 ãHZCqdʬ@F,͙hiCym}z7r>_Ԇ,p:6 z}Ƞ>|*uk)i JHPhLnR3O'S)B%biG6it_@)N9uI@괐3ܘ]Y넏y o JU Jd7i/\ƭt[hs}O{."}Oj ~yw[Q\eő?"cwOn߅Nk/wːKF4j/,qEeig(.q"9C,$΅_;ï~WGr(l&)^ 21̆ !XPjCJPy'E1,&ĆQ-P-&jq-4.]tf%9*{wgyZxC|o޾}R VGzzjI @^W.DwQzye˪ۍnispX~d:5i?G(txKe \;07aJƑbCGE],04$vW|,E6e"I-UboV [L?(:Ɗr7ӏ|јDbU9جKzc0w0Jj-j< pVHǗh5 kϾtZnEytG]FaPmҭD}[+KК54']P):FvA `Zt+ct|VEo** sqtw=Nd S69&H}\eՖf$}v m `GCjnԆۍ9 ރ÷GAEi V34%o;R~,m*'Bsƛ$: .Pr|9Մc='z8_~^Qxqq`|b#O%:*rj)G^m\*p)juHIֵo]Cv몀ȕuy/W^\ -x69~zy+9o73W~,E8$EH~u fZڽ_F_8H!2?a8_}];5~^#mn+%bB<}7z( SW􉏻/Ů[ӻ {|C)B'; '#8 EDVzٵfI}<`ꑘ REq tʆ{%e32gp LeQ(@0U $`$(vMA K"*tuBCvmY ׭aJ/UsN3 %J0K}l%Hd\&g)%jBC98 Q kU&5Xc-s$.Õ)6d1H&UqQ!c#c-25+v ;4A8!rlq |x *2kǐ$ LG3&I/Udҭr1]Q.F7 nO_RЧ/\w?_r%RWvT ߿OΧY 0UyZ@W2R M @`Dd(ُS$ ?qHaqb( FZ lҽŹymʫ5 (l[0i:LB-@"\ sM'Jj9 *qi|EKS^=cU6f5 Rrt uyŢD?aV^_""7pMY7q~MbL#^_WmK]i홻fTqH?u30“00 &89 C?͗0ՐsP[߶LWoF=ceybgap|!t;c3ح vjq>r~5胜C [GUb[;nim ^ϾeQ@ t $<[x6<۔s)hrI.y5~']2YpLUFcKS1ӥ [%, fpL"dхR93[D>|v+<|5:Q#c(/"*pGDyIx<HEcL0Rr1.5Ul|>K٤+̲  qX 2}&;u$2>ϼL h/@:cwmsUF9PѤ"Tǃ  xe`\3UZR,J'K4>NůEs77&]#}l\ `Y[Ec2!嬕. 
^jȖJűa f%!Eת --r)lRH]rêqtj8 /B  \p$HRHII(ݗosެc(.7OB Q/z4VMGqci8;!PB&A ǔqAz 3qV!d )=9aj5ލsJ|wxŔl<+ >=Fvז92Fw >ٌ"[z^-=li9Ye@;*IqټPhDXD/ EX#s.5i?a$ Vou'CkYBa3 "L .MH̀3qp|<8`xI>TN/[Lj-~XW5U51We|L U ?Ūl_r)M[99b_O9РhS2;z/1]BeBVx:wE2|bD9HWƅt.``>F1M K\ j;7X#ōN1\XIGWlk{ 8s+߇uEus='"DJ[=nEg'H灀ŠTXdT%3aR"APg+zs%pLںp o"`%0 BŒHUwuEN-Oc:D\Z׹J0湃* نj$6:B8q,A(4cEV l|8/(\$BMV9yi1 V'Uˑ1Z vk§[e=khQN[ckm*L6T4|Fڡir2_rkGhVԏDbQ#,Q2: Be;JJ j3oT0%qmqy`p*v?-KB]oyQQk+C34c75ѕIuכhָT^(lLgW0Ś#򋧚wF' ic2ѓqU+s"ɉ<;wXEg3P`)Z8wƚ}Ӕ"V^xVKՠqREX IŞdcv&ƙ%OO'jkt~}{5$E k~6j0 ./^X(|/B ǂ6Rgǒ!RB}vѶsd˄?`qKfQo9<7v).Nhnzxj/w% 0c@)/OZG.HjÚփBZ.e:cPV0|9JEYAa$tr!,E #(% $}JvA>6K%5?ʓp;/Coo?>DȵwhT B b0y EƔt$IID1!]c~ՒZ:Ӈw=p?}{92F{!hNvCkh#c]`]gFg_!p^n#NC*D$~-AkMl`BXc/Qe`HEW5ntMM ȦeiuQ'hٸd^hPee&U)%I. 0] ˚ܯv/ u<:"`m)DJ1ث;.AZ0x|F|+#)Xp4zADޒS:x;4>Y;.`{ZzLΖJXyoY[=)piIgwMe- Kl/o2yjϰי`sMN^P)D?8| go&גp,՘֬@zRkzqjWtx;vvn0^}~dVo -."h ah=ʣh==Q;ZOC_-ɀ:ski3s &Z.B`$F rOa2+"1fB|z| r"XUl" #clLwd #{=^Ą3wbs&rZN|/8}W7ǒV'@R[d+m4w؋Y|hۈOHCk .-1+YL@DPƔ2)TR" = m0.::F%k/l;3a% yjqhYXKWL(cxtXi.~TG˯]u)\w: n=Cҹm<*]$ӏ~H5 ;Ol,Ϝi-ef-]]uf,q`&an ?N!tRW hQ\0i:"j5h^I`( P4oѣӁT2r,zw; \QCZ8E 7^;?J)*i!^ܵD8~ S\e>DOA xl<>5{|"!kҞnt:Y2f˶QG㽖9WR&O(ƂYAr\D`5~uKN %T$҄G$卝ڏq7%x~Xyd?d|9r+Ÿ4l9Ib $dl8w˛xe]fzoC\b18ɃcBU?6G[#˜`A)`rnCY::}&vZȝ{aل7 -5y>/=p~}~cوϡ"F+Z`Tԋ4>Qp# YMygV[řYp1ak͇yt/gŞK-xj,_*ByKfՅJ~ɡڧ8NBU'!D!8h\I Hyaͣ`^iZOoh@ANh{aa_ PJHgҾj9)6"bIC/)W7]-yfje6!:mbs6_M߻e^)%Pc.TҥW3yzɳ$XR&i !W**E\T~ |2&Ӱ0j:ЛчunaC*j4Y-bi|8ƃ,"7]cẢ72^K2N9~q{찲Gn\qܿႲYI{Kb^q L>[߿Pˍab7\tS|>G0hQxo@R (Txm澩G _j(h֬A*Ɣz=~-ƥӢh""4R܆C_QlE5 5r I ˕%9tlG4jЌ 5J7 -S% Nx:Z`(׵C B}ms798$?nL:r^ޫ6  ϸU22pKe SB)P[>jNoԃLL;@lJ(odf 0Zi>*}u@|ȇ(QOc>nTmVU[o$ hd@a\gR8lI3-b9⇷i F7gjt91 EƚU^r b:h;ZwھFx[c}wTy*T 쬠T9=F6XUm\ -n9`;VŁGd .m3|>P)Ɯ u-iQʽQL>w(NҐ0O0B%y7tRRER^"CuDj@eUtZV9@DL(ʤ"Ve\W"M&9 gg;+Zh:*4;\2(E{I7E#s;'p 8DdBWr dNRdG_ :+r% G$,1ʆ$!NȔ:ah%4 ֚Й66'SП-<޲˧<(DʶD. ԣ~kizG.hq^òMϦٌuQSډb3 2 1T%ܓd09}"fفL;꺦L+'窛_sܾLq>W߈VO4NCv흻fb"k&x$Z,q :i[oD/Dpő1ޕeQՇ޵ܝ P%W"*(u]^/ŋ6ag,^/¯+ =.;5 aAKbCHkA@&(tVyPfOρzE(Of*׸pkk:v}z@^^'#O{up1gZ[z=?T/@" SR V*j+Sg~cS uD UmY=+pt{. r Q -xw8_"V K#$,/=ڒR:>\@*ӥgjU330\ DXOg%Bs2b9!9 SʨJd0HHp\9n'5 ƑJX^憼!en{Yn[ŒSOŨ\)cd$$ˬ3 בn{r2RKӁbi:|芪_] R 'KAUxñRP5&o)U#J@JY˒Q1'I,8mq^xJ m "JpTjz:0̮.ܬ=Rg%І #PW̙T+WՂ+uQf^?GoO=<Ygs1AΖ_e_f~vwFF=ڧ77wk)z2d^;+^=zgAٞ?bZE ғ5p%#b|X̔0T c)Rja)f/R5ׄRƢK P\vg@˝̌3A (ћ:Q"CpF2ASlmTH%U*GLx7_\LI44նeRj 7yRaD1!^ۏ)` (ȩ4X&Ό} tFcX~ LbncẙR2Ql0w" !vHa)8i_(H~]Fi5ȵ%v2OQ팧g`Ѓ#h(\6aQ't$S"H1N4~OG 3 lbZh4Ǭ^H)mRVɌch&tUJu [hfR3a5/X-ך*Ѷ*p/^j0)19|itͦ/Vb>hݟA}ŸIN2%]~͸PMD(jamꁯhZrO <εvh7Qw#a#z3M4He'D9HEr Fb Ẍ́KЄOp-H/$[D/DQ{!R=Fp*9t^Gg$9rs}fSfE_JT/+cXBhϢd2TB(Zp*K)H6*x:0Vˁw @Vk=kc3:8lSK!w'C 3PU{tr(W $&u5~#5zw/7w[ 9ɭ#WsT|bv KxT3zϖh}g/Sʬ$ܤtA"7Yȶ=d= `\ ֹ'R)B_&[\Xj/> M;y. _7PCC^ VްV˱.SDj:"rDVD9ְ&LS3A8gAhkRR 6mf;VdrRQQ9v9%n ơ$0Vc 6%Z]V,0p%vM/6 }qu;NZnoo_v7 Vß.z[.1f\ݿ~~WXe^53 3ZveˣOd fqİSB˗mlovWN(>L/my|1}c~Z 10\S]}QW0˛-6#q ]7B,WZw=PM[4%a'BgD3jrEIV[ñmPKlv 촖v|gx:Xǵ;cA#Za#xR:#"%M%.QEf摒V em[iKset\FG:WOϺ}SqkѿmH"ơ5lhkR;*:st0V ̫)Кdg9 7\:$bJQZ竓wZ lbt-~_B7u%w'Dqo{ q"ӱT|`upHlTlls q66VTotDըs4]7t闉{W5pXZ8NK)`>}q>?VP27aw{SYH1 1A8Ƞ39@9̆ tNn z\ j'?VQOVt+`4K?9GG [GP{)UNJ xֲ(Zn1' :HRN+ ܶe8 .Z0 p_n|Ֆ6X1c 8d>TC>e’*. o2fvFdO-7kY*0YZːN24ܹp0rx0%#[@755t$iQ6yp 2x2LgM4RNgBj鄴" iEVO?3嫽K+c(ZT='sӷ;O;d޶ C=VqVGXJzv}۫]eVVal0G_t/ڛ&gQ?_ϟ8aT86! 
`48' R]pphMJysq"I"9;R|GשŠwI;JuwHs^'W*Z6}(K6a}1]c ֿh8eUށM{n~mMOiGgh8,v{~ޭ_$PzW+y{%o%%fյe/7It{F<^m7Ouf̯ っ$'nhaɭY 6s HM @e0}W1 EWY\^IJfDOId^Re0'V' 2|h4;Weح.JsszWn_L :g`|_>eD|tJނ%N_;|2ܾ[ǝm ^Iuw4Q ] dG}:6uq4s.Jp~/!p侹xzfS~__ߢ/][o9+vNۼ_ a,v$ݗX4)VfjE=jcŪAm^P`ejoe&A#KSwCv*[oZw2朐Lo2p*ڢ:aJ7)L5խhĆGq: I{<5:y1"l!X%5T N x^gjyќO DMM(g%V D~|p݋*Dλiv0$;OdwxK1$0>BcB)ښSrdb4E23GR@d Iw!Q,S9,$?V*F6E(T Tz8&b}=|b)ف |ۢHF:dܖQ+=Y ӟ?> 흔Y"0^z?Ω >pѢrHDdE>&%:de+*N1\uLd=o_r<z3|O7Bta2ZRv#J%x" 29DS"391sY"IGDB )c"qsG6 1kL:!h|^J{b n?Jٺ0wXd=Z[kNF(Q^!a9HBc,é%13;buEM\pp(kc8( :.mԝ" u3 ν?-6Ol}Ns[.+{V;!d 6W% UP p{ W~-ujaIL =_:;.16]~wzW sPj RƊTCaDX(ŝ9K^ FZ hP@!hZC2xxtXH%~mʥA#T{vc'4Swuhu 4E3Y]yh٩&Lݪ[^^]wn!Ztvzx=ƏW̌{\)׿B#plql]՗6cH+#R/Ub^ٍ.6a+Y˚+?a_}=cdzjo3w5n8|iBl(*'!}x'QJSy'͢A d%f4 iJ瀇"Z *p7UUσ(֦}|Qz{{UňЖ4bցR+y*Q"F;NG[];9 cwVhԭ97o$[~VC%(f8Y:U6A !.T6!TIۏpG'Gt>œ'Zm#@z+;m\wаPC"pJ #J4b4᥻JAB^j+W?.Zph@I1b.i|AT, JenjmԤYmk !$Xi_R0mxps"ik#v5 9[Uu"EȦ{5n}I0aH1&ܥ~#,3T6 J1D`Vc.nWPQ̛[F$2H8N_lOKsK܊Fyg(\#>t=:Bp"gf(s9JafSmQ5|VuW󗩀T[G1[n(R"n0#Z LbƔZ.CW&$ #^J` \fΙYL ʏ[ ZzaQ* @&D]g1n ݔ'BX[ ?`uǢhu9W;^xռvZ H(eiSUЛ!٭;X~a GM <NT HBUHJޜL:n`^ulɟ^_]:Z!q-8gWwN fwΙ_\{+nvB΂pvU}]./w=Eg!LiTCp<0QR8$y*.WPD0d O":Eɞ1tNm-1~m[ ABA>ӂ&1\'( a]uZbFH(r)/N8oag"9g+m|'U1L6*CT~ fۺgGhRܧi\LVCo[d{k8ҏRH2sPpUQigR kÏcgsyEe*ߝƄ0Ԕ~ \"\G۱\J(e5ܴsi{ŎI 5=$Cp"Q & {-̈PMێC/[[zF ۊH>@CrM˵#$F%JD5BCT;Er*l`ҘY pBHgSdccrX)I5d*`ֹ緌oG4~P1y-% F'2&2 d[Ǒv f1|>Y!$QE(I\QgoE B!&wvHfc$! Sj0ߣMP£eBH JYCCg=^93^kk6Mt`Ezec #)^ezh{#C\֝Z$H9RI*Ӕˉ#ziɉE}|N0] `ֆ?FDwD_$98$z0HCTb+H$[YWotZ6]_]\"N ) )ES@4"@[Q,pE%`b10Pӂ | `D IH>UEe҃Yй|}v#W'MYBBp?4KY/|OgɃܻx;Q? RRF3c1 C)Pjb O!&d҈9!(hy zn2t_M&7 ?&~7w0Mߠ<RPgnә7a9.jB1X=oC1S2>&8SS`Dz^ 8Lrk>Tn%-erFމŧqD㚥 #N';I$Bx>*;K.bCi1Iubр 5d'w<0FkKI(8$GHo3ΙE&v=c['[zƲTAU]~sRg)yо]6ué sj!)[4U"&$]|RљW6z'=G 0V1ðYLjح^jB%[GJs:@85~FV\x٤&Xy[0)Nh \hż?͎#;yrⓧSY%%>V{Bۯ3$J UWoHyqe. KSԱ@!I qC"d9FbH"n>נ_ot8ȼ@ 1'n}\\zb!gayp)т%Cy?n-OI&dR4X0sUsV'[{)+ohL-PYy@ Q7|Y7? 
Q{; eˋW.%"-^X2Ѫ"EAud~;QGD=cE;pTn":c& SH;jY,!@:'  Epa&dEH5zXhVctRp9^Xh8&Z0 Ue]b$GQ^<+YO(U|!rdμ ϲʀSz Fv6a'uKSSZh Mځ ,X'J=7D- :򠌎rí2'SG9@N&eVU~%[/*#K??xܵO>bWYU)Kwp^*-!5na]xצ=1ldğlbm7#ruЌ“E*X k’jc]ʴFb*y}ޮ~BxߠAGpeTd4tL8`pN/qHuٳ]+WgߗH_qO}uv~"zXgwqmr# vdz6WpR֍i6Jk8iu cTګ>.$A$ِ8gDSܔ.*X&[;1S$RmBYdʄq:CrF0N "Z%V1I")s2 K%%dɘߟCЪ1.hu^ ZVryU[LiRBAHt:i|RGɎErvL w%z$Q'SC_4HL }yU5 mF&yy^jrkrWwNe1,1Hmr0tB-9#I,$JRuAc b!:>#Ȁ6TO9jXH,P>kq@k/Z$0ZJm-kM mn"pPlTjJB:N}?y"*TU ,+A@2C{>F2Dl$x)˳AQ/ ̡!`"q{ڞuP3N~(V moRdH3I lP+zua5*Cv|葅 8F60 4W /@qՆv-@\kIw50p~(HFg[dCC>u {r@f0 с!Jt-X*ΈҩR;&dK 2GXp|1r(jku>wKEWC&oyuQ|MTN޼& sRҭƬw=VEoBLeLՊ_vk2e=xQV퍺۵7@nCB{ooo>uRq]:j- >YCErqjhש/p܏NM=G*ɷGr3q;AGG> L;-oȹ*ynw:&,ys:fec2hR)`% LNϜۻ@"><H&8bc I7[B&ˑHgI%m8$RL8r2rbMh du!q)W$NnUZ-AZ4Zb*$ 1jѠ9kϪ%}Ӥ@8h364 =*">x36vd S:F0:fLhsKd'} $-aU& `W#CH!{Qx_F Vy}2@p/qGUgp$nɩ29-剺4&֌\nxo2me1 A:|p$qj5}Rse;*5Vp_N_HOPi'`;-ǼF!Wy`.WPLl)\J9Oq7qI1kʻKe+R7|Hԅ%57ލ]ulM.Z1.Kv+HDLPQ)X$(fjpcG{ޱd5/džs -Lisߤo>Q3v}QH]h/Eg ޓ|_3)3ub4].ͷVkUw!#|՗Gdu G,tj#KiO^f=M(t#~5r~=EKH ڳ'[VQѽɗؖ^ْlM8zK2 \Fh9'iYB50Y쯮?l)yǃ-O[mO[' U?/m@k{s}̓Y͎t@mBLN6pguX۳[UgX[ꯊUggn2wV)@E=lWTMRXyU_Foj 8*`=ŭ29%.j b0=GX `up)1ID([m-K(rf'-~]AM2s9K 5A\qC/*0j.NU7TkpZbON۶1@m@͟\uÇh0+_pYB{; 9p|g^KY6xV5xgm\/gѳ3s75 V, $F~QbV0)8ά)q3L (!EϳtQҡ_ˎ%↌o5\,zd4 B8jYMVj' G}Hda==U7廉--ÇWxL8{\}oerJ[d}eUgEwͅx ywWCn)kUz2]Dʷm;s?^_뻣OeEؘ,X"l]i]ae6Da6Q 2mUWZ1K֏93#6 aC\o3:̈́vjLP:N&|Zg Vp<27`M޾wMw.;2cbrKёE*퓦c25EG6ڭ% %X"qF 9{7\ZV]템qLyˊiݭyn44j>VѨ3RfF;'u"S̄0K /1JQ ɗ 8|(VËp!~upM =X6#sǝT_BP[c5V5MLr"x3f"txl*t70.L<.)]O")rmZp~~;wQBQ1/TJTkƔ@12/Ut K;Ot+}yW8p%blY`](bK8wjBBI}AD39 ]NVFN_m*h}nՖGPO*VTLit~g%iwgk%6ʖ߈ׯݪ_.|w]޽toueoڕӤ]V*]]XhKM-6.iNv(xT*A@<%8o׫K,^Zl YkmO^~ ռ屖T u`,Q 8ՅGdURhqesJ),Zq*Ls*L ot508S\Zms݀i|>3٠G\EfgHLV%e763m#K`焖qh1`\0(tH<5d,:2[;[Ƕ) >ݐɁd/eɃVڗ ր֓i?H&)mb4d]Q%4QtTouu0ff87;Q6 Içăg7`r&tDkDg<5 ) Qټ`?C,q܀JQuI4(ȣ :7{蜩xF9S5| VW('jyrWb- Dqqk>/s`t;H3swDAHtdJ{dfs?:k/8Jj3c^&{gVXX5x,4SѳdҘjpccF=GWgߞɘF#zJ&Kr9թ1)8gNäc6(duHZhc[c]RBP:gO^&wM͇wX<  P~gw¿YM雚'Y?O7EkOå MGeV+4AݨiL#+}Y<e7Fo$ u,I%_"H=elÞ_UWWUWWa# GkT qwsy(;\ A3s /%(fBk潷;oJ)V())h>cJ*T`Fts,@{$eY.1b:Z8-ODZ.ȉ.@dI6Q.A7S\HB+FK%Qq[AĞW)3,F0eq-K()[%'F4H4 9C,ޏa&5 ܭ(\.Ky 7r-2tB5]i{cexnǮNhDJ~]2͑MH[bwXqmaM012& 0^Q]oa s K\Ƚ:l-}L%}[\"l tǜ= m %OlxZZ6rhiZJ|zq<8%c : +kC)/;1t8{\DRT$Em!UB"qb}<8/sS.0;hUʯvMP1>/k"ġ ~[5Z=gRHᣋBO@b,81phMp&sҖH]z40Zའ۵'izȥD)9+Pa_*U[m<0gxJy|(⢸*.][.%7k"CPlэmY*EX4Q;_ś$"EŻ*7`aL0‡cC-ak<֊1oS ҨyЫ;M \k~y:b,3=+Sfr-:_FS&wWyN㽳qfh?fYl^"vUlfkʉlW~ݛ'MTZW*O$ZP K'KjEk"PEe4*`n.;81Lb2Lҁ#YT^ND90u u~ 8U 'p宼_S|@fp"4VNRB= c=m`$Bu`#L7ѧ^4S֚ŸѠwwq~>aHX"BņLdq``9m5KH\9TR铻Z&&b"ְY )PS?Yj=Vޮ kҍ|tڧ#$c]󖧙RTCZ";aiǾqK#i[O`N{<{fGjz>1 ޣ7>IIb#ǹ]ڝ{u ):$)A]}op9+1fgA'B1ƗP@8L\ D9yc4VJ(&Y*ALz^BW5vR{ߞjyRk*S.fm-I~pK~&s$L:ȠP @=8b R`UZ`_:)<@F'0))];LTڝ(¾.v&9֍ Քx-~,f?&V17%~43^\l?gcj_OE̝~Eϋ8 G='GcO9I^mZċoRžRmKvTzaRcDGhLqv˶2VwH~ٞLe-%z̥V ?LH#>ԂX"R`,@Vb &ae,#΃+UZǹg%"КzD$bXf KiI-cFYGk_GT(x2ۇʦ>#_c5Xa"Y14ɷ3cQ.xRaDKYq7[+kxLhxOU09֨>yGt7 J*Q`_+$t5=W'}5 &n fw({ٲͽ\_nFÁ=<PgΙ`tTAA~ԻL}يKdWZ ʁ|'uCvHu>4rWrs7KÚL 1X 2t*-j+2E}H5a#β#MW}㥖J}/*R*1U8i+9aL^1DfJ: _!6i Wl/_6:*OݰNF,&K`Mf.R4xf8 1z֑B;R<׺@{[o]C? 
'CƇ=L&hKW0@mpGm-^?+&P`/vmb*&U4c*jV{tn~u/={Q|7;U< ܀}rR|-1}|a1q|d1 8gzweO9 k*/G6=ŚN9 f`1ip*z80d7Fv)G~f5YP[Eisn ʘXӛz6TMVa0XpܕS+!n9ͧ1lhc0Ɓ5UR0)BadƸϝ/FKYMpI TpoFm8Lw8-1$-K!Vݐ%bN<ŏ艵A+n-y!IH>+D54qI@VV#,ZC$<塔ic=ᘮ} #2C0 *0 U1pPl{RR+p,+A^6*&UY+cI^x"xcgKJeI`E)f20QwNAuSTi)І :6JX-b7gNP!82)*5bd_K jh$4 Kb8-V!멌L0(CD!^7*nꐽtׯdua]f_oV.t &4L%CjӘ79N#W7fEO}wwx:<6Dу$|rR%}Lb-B[:Gێ!2do`E-t|X4Rr}t1Q ߍ>-eYr+ T>!$n&=hZ|sFsIP4<, o_ӮKM`9gka5m_a4(?Ejt7|3#jD" +S^ 5,oCFƮznu!F"[ĜΊ'"Y2s6=Ѣ;I % ?OkoFvm޴占jGyqRy ^-}~s>lËNJÞiJDؿY6.{,%rI."D8xMJQhJiD ,O3V`\rO6s0T$kI#Ft@b5νlV%n '},Ƹ1܌i1B%n(A'x+Trj{ʐѰ=8lG"FL 2p$ER nKpؚ èV:WmTu8NU%Yٕ֠K[#>TsguFPM31cKv;Ps.ӵEU`" 1B[o5 ez- w/ ׏6v^<1Qzp'=ԃ{ꎺo`Z#VfqnG[KwͿ.TV#7?w^n&)Q锣ẙnk\fzmًS:{P%859c"[Uwl1 O`rj-!d!ưXdI\: 4f%RJ8ň ̭$(s) ![6|Tgz㸕_r=w ) v svXYْ%9Fv\$Mn{Z yzů*.=0(q(8\[WV MtZ9c$=2qQ]}Z;ŹD{2FL^9oJBjN2e\KDzmױަ,XBB^Vx3VAԎEϞM<5vhvk!!\DkdJd^1bj ڭѩv![DK[ y"!SPD:nh] R "fc,zes<@sRAErN=NRI@#!@p#Kr,5rKa$q*5Vg]SG[2 ѨtJ>q(e*ꕘazP~ 2DQLzٞ1VW]1.J0a Z[EQ<|Tt=-dƎ"PͫNˠ ǀi_~I[~'Ԃ꓂?iN<J<*Pg?gٙy;% 0ZgbB됯;^P lSlJ ds=*Q|a':Ktp..%e~w v(+ F(U'L7߮ F@!*ƙ e! Q|O"I\ikDF1fBW80 z3fJ03-2f޾rtJ^=eoetBAFI,DzS?E1R_~Xg)CN$L(w2 Ѭð9-=nIjeKH.Edhh҅s)c9Od4T̈`uKF3>0g%AiaksB!X64PTh-θ 0AIe% TbkPp,5 ӌi fM^'\LX+ꖏ&UoTz8pT\i RޅNmrjMC^ 拃p[UV1@!lROyGU%2-hANaa?F{e̅K_o6T_"gqhWK@ذRIՅi~>wuַW^M5I<7f2Ġ ctibpLILUGkbP61h!!\D+d*%=K}`ngo=R(c{<1ƹ_; XG#iE&9‡j&Vv gɷk"DѮw_@3Ɣ!gfy!ۇg 콓V !~rʦSI"=G~w 1\/Q|E~/0D _ݻ!zzb.,ݡCOC? =θ?${1xN8zg\M;U&+TZ}<މC211 e3i54Zs[y 2Բ4Z+'W[$T܇X| v̗@8j_(nΠ @=;$C"B1(|SAgOh QcGr/ ~c $0ݹIxc%AR+;;>\ >G".%ֽw]1GJs2O!8?M8D[%o}i`LG"}abAމ eFdC?o6^KfV# kUd!U˝Ezt [" $6#ا`6aNq!̩ů̘b崎&fP[?]Bf)Ǚ[#S,~;$3.X(&yd:ffH]J,6Ȭr_=VQZr?D}l-Ϸnk9X¢4Mg'01I)䵬c3Ц8`V71)Hc3I/ Np:._Vk<d̶޾ .v]vt'4 wFf'!P=cLrIF%f1;Ï@J$} π*>CQWi vxߌ ,ׯw|qm>X5_;TE,,y$݌i3@͈ԒtJk-XcOƩ!sفWd %("P/ 6^E7ioأ2'q{Tأ[QtDyDfXqѹVfeP< B2XX'Hx2(sTDUD u 8]ߒ~k ;ʐ:-SFc#`tc0(`,O b@* P'@)f X@B ڰK+0TGe4Q`D6Džc4yAImà<]4+dRAk쇠tU?$: :wD1TW Y-%j҄xWef ݄ Y y"Z%S MBo[ɬ[-%S;FvKSEhڭzڭrT-YtDVAԎEH&Ƴݪ'ZZH+k'f8,3"X*6924 ‚;ιd42"ƒdv?wq4kf VCu4$HXrb|%.H q"̵@sjWqpP|݈Y;8QYʙ%Θf E<qeƜ| ^V1BxH(]EPݖSf9}Δt#nEXsɔ0agn};'O@|?N8oMP`^( L &ïj ww"}MBR 'Gf jd6q' b dZ|XՓ֨t騔:ͷW@!8#tBr5C?i'9sA3Y|ǘȸā~XVq}MaW?!Na'*2^˔Bz.5D`-aBR8U401$9"ZJ jt4QooާyZځޞ-bA^I wz7C\|(ȭK/XD9RɷR /*~D $_nnn??ge>û7黋+s"ApQo*&/('_V@ɗU~ fe]|'U ܩE֫"Q-)S@b:q!mESAPg5(HF04 eJ8'́ vA-_$jU ja"vpKXB 89u*˅ mt@VJl (rE<B'žF"S#Ev E5IB˔~:҆EwflGO/ pć?[}S$>=p?7񣟮:}o߽3DfWXoDa)67S$V>kqr 7 />㥌Eի$~hH+.RA;zݝ d^odT?kh?T\,ׂ`q) ƃQ.(k8Z>H ts-bĐjs J`z͙`Fh!%@4 kN3Ι*leВ$(8 l8b! P Ncky)Ƽu-X!ZGk 6!8ERq[`d(b! )::"iBqT?5d^e:$3@r)Y<5tcTH,*ؽbb 䜪 ?BYIJ2ݫ"VA>`^lCG%q2xDzgjb]q'/_o^?YrMEyM'2@Og D1u _Žp-=a7P\> u^d#N˰qiZ1p3S+1c Ja4(fz H,F)`C24t!9A4s."@0'͸R~\=Hj#4? 
Feb 02 13:01:19 crc kubenswrapper[4721]: Trace[192135690]: ---"Objects listed" error: 11118ms (13:01:19.093)
Feb 02 13:01:19 crc kubenswrapper[4721]: Trace[192135690]: [11.118650069s] [11.118650069s] END
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.093776 4721 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.093762 4721 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.093776 4721 trace.go:236] Trace[660090621]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Feb-2026 13:01:06.514) (total time: 12579ms):
Feb 02 13:01:19 crc kubenswrapper[4721]: Trace[660090621]: ---"Objects listed" error: 12579ms (13:01:19.093)
Feb 02 13:01:19 crc kubenswrapper[4721]: Trace[660090621]: [12.579577534s] [12.579577534s] END
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.093843 4721 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.093970 4721 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.094053 4721 trace.go:236] Trace[478236917]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Feb-2026 13:01:04.461) (total time: 14632ms):
Feb 02 13:01:19 crc kubenswrapper[4721]: Trace[478236917]: ---"Objects listed" error: 14631ms (13:01:19.093)
Feb 02 13:01:19 crc kubenswrapper[4721]: Trace[478236917]: [14.63208771s] [14.63208771s] END
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.094112 4721 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.095106 4721 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.105452 4721 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.119716 4721 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58242->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.119807 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:58242->192.168.126.11:17697: read: connection reset by peer"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.120552 4721 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.120675 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.172271 4721 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.172346 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.343941 4721 apiserver.go:52] "Watching apiserver"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.347984 4721 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.348240 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.348587 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.348670 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.348686 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.348800 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.348960 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.349017 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.349586 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.349634 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.349684 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.350815 4721 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.351129 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.351196 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.351356 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.351379 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.351567 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.351585 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.351601 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.351655 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.351920 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.362003 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 07:06:56.180539798 +0000 UTC
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.376174 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.385905 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395495 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395735 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395781 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395800 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395819 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395834 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395850 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395864 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395877 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395893 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395910 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395925 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395939 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395956 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395971 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.395988 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396004 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396021 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396042 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396104 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396123 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396142 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396162 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396170 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396184 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396239 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396262 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396284 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396308 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396329 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396351 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396374 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396397 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396420 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396443 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396459 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396471 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396492 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396517 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396541 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396565 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396590 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396613 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396635 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396659 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396686 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396712 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396733 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396756 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396778 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396801 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396824 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396845 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396867 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396888 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396909 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396964 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.396986 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397015 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397037 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397037 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397080 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397088 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397120 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397146 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397199 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397223 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397250 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397272 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397294 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397314 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397321 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397337 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397363 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397389 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397396 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397412 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397443 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397469 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397491 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397507 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397516 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397540 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397554 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397566 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397558 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397589 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397613 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397639 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397663 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397685 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397729 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397743 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397752 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397776 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397811 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397834 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397856 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397883 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397908 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397930 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397936 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397951 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.397984 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398039 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398084 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398093 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398089 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398107 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398165 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398166 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398188 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398212 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398231 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398241 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398247 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398280 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398300 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398318 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398314 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398336 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398355 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398374 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398392 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398411 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398416 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398414 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398433 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398429 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398535 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398575 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398583 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398618 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398642 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398670 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398707 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398721 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). 
InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398730 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398755 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398754 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398774 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398783 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398801 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398812 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398841 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398867 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398889 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398944 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398963 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398978 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.398997 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.399015 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.399033 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.399048 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.399078 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.399080 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.399113 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.399132 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.399175 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.399180 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.399223 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.399345 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.400526 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.400814 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.401026 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.401043 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.401267 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.401354 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.401479 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.399352 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.401581 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.401611 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.402533 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403034 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403096 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403142 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403184 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403213 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403256 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403288 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403919 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403955 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404047 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404107 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404134 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404157 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404183 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404208 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404254 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404284 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404314 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404339 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404364 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404388 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404409 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404432 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404454 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404477 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404503 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404528 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404554 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404577 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404602 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404628 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404653 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404677 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404701 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404724 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404745 4721 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404767 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404791 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404814 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404838 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404865 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404889 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404912 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404937 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404960 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404984 4721 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405010 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405033 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405056 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405097 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405112 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405128 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405145 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405164 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405180 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 02 13:01:19 crc 
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405197 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405213 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405231 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405248 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405288 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405313 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405332 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405353 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405370 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405389 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405410 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405430 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405450 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405466 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405485 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405581 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405600 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405619 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405903 4721 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405920 4721 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405931 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405942 4721 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405952 4721 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405964 4721 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405974 4721 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405983 4721 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405993 4721 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406003 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406014 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406024 4721 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406034 4721 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406044 4721 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406055 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406089 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406099 4721 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406109 4721 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406119 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406128 4721 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406140 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406150 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406160 4721 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406170 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406180 4721 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406200 4721 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406210 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406219 4721 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406229 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406239 4721 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406249 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406259 4721 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406269 4721 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406279 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406291 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.401781 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.401798 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.402020 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.401378 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.402328 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.402343 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.412992 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.402779 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.402784 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403016 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403039 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403101 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403288 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403312 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403577 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403597 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403623 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.403207 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404054 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404208 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404223 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404264 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404354 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404382 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404621 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.404701 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405119 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405119 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405164 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405376 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405520 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405528 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405640 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405833 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405838 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405856 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.405928 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406080 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406276 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406330 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406362 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406537 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406554 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406652 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.406715 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.407081 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.407323 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.407398 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.407697 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.407811 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:01:19.907788418 +0000 UTC m=+20.210302807 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.408186 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.408275 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.408320 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.408345 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.408493 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.408525 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.408737 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.408925 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.408900 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.409111 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.409222 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.409578 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.409664 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.410058 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.410813 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.411209 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.411220 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.411348 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.411446 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.411532 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.411653 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.411708 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.411795 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.411879 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.412238 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.412887 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.413302 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.413322 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.413529 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.413430 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.413886 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.413976 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.413988 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.414169 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.414292 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.414465 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.414551 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.414621 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.414731 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.414734 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.414850 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). 
InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.414872 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.415011 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.415375 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.415370 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.415610 4721 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.416371 4721 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.416540 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:19.916513449 +0000 UTC m=+20.219027838 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.416541 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.416844 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.417317 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.417393 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.417739 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.418209 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.418311 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.417775 4721 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.418603 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:19.918550463 +0000 UTC m=+20.221064852 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.419624 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.419691 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.418868 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.421549 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.428013 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.428275 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.428688 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.428712 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.428727 4721 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.428781 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-02 13:01:19.928765534 +0000 UTC m=+20.231280003 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.428878 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.429113 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.429135 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.429146 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.429128 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.429207 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.429248 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.429668 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.429711 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.430348 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.430637 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.430964 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.431764 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.431812 4721 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.431886 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:19.931867847 +0000 UTC m=+20.234382236 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.433588 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.433654 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.434091 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.434509 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.434642 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.434672 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.434722 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.434746 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.434968 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.435134 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.435230 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.435487 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.435544 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.435705 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.436149 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.436295 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.436857 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.437467 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.437823 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.438119 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.438487 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.438581 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.438615 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.438833 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.439117 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.439261 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.439430 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.439988 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.441412 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.441554 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.441708 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.441745 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.441994 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.442048 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.443010 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.444432 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.445044 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.445057 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.445102 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.445252 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.446334 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.446449 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.446957 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.447277 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.447378 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.447722 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.452130 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.453412 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.463054 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.464088 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.465062 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.470802 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.507480 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.507685 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.507738 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.507785 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.507751 4721 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.507815 4721 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.507828 4721 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.507948 4721 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508095 4721 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508111 4721 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508122 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508133 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508143 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508152 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508163 4721 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508172 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508180 4721 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508188 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508220 4721 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508229 4721 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508237 4721 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508245 4721 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508270 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508282 4721 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508330 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508348 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508359 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508373 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508383 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508394 4721 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508405 4721 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508418 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508429 4721 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508440 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508450 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508461 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508471 4721 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508482 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508491 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508513 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508524 4721 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508534 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508544 4721 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508554 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508564 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508574 4721 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508584 4721 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508594 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508605 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508619 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508631 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508642 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508653 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508667 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508686 4721 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508697 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508709 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508720 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508731 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508766 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508778 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508790 4721 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508802 4721 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508813 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508825 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508839 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508851 4721 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508874 4721 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508887 4721 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508898 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508909 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508921 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508932 4721 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508943 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508955 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508966 4721 reconciler_common.go:293] "Volume 
detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508976 4721 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508988 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.508999 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509009 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509020 4721 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509032 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509043 4721 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509055 4721 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509084 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509096 4721 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509108 4721 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509120 4721 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509131 4721 reconciler_common.go:293] "Volume 
detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509142 4721 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509153 4721 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509163 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509175 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509185 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509196 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509206 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509217 4721 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509228 4721 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509239 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509256 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509267 4721 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509284 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509294 4721 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509305 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509315 4721 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509326 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509336 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509347 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509357 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509375 4721 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509387 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509397 4721 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509408 4721 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509418 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509429 4721 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509440 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509451 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509462 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509473 4721 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509483 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509494 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509505 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509517 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509529 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509541 4721 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509551 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509562 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509574 4721 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509585 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509596 4721 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509606 4721 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509622 4721 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509637 4721 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509648 4721 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509660 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509670 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509681 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509691 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509702 4721 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509713 4721 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509723 4721 reconciler_common.go:293] "Volume detached 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509734 4721 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509745 4721 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509757 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509769 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509780 4721 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509790 4721 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509802 4721 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509813 4721 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509823 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509834 4721 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509844 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509854 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509864 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509876 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509887 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509899 4721 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509910 4721 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509921 4721 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509933 4721 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509945 4721 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.509957 4721 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.535893 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.538031 4721 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d" exitCode=255 Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.538114 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d"} Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.548943 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.550773 4721 scope.go:117] "RemoveContainer" containerID="db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.551470 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.561319 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.576185 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.586113 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.595562 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.607441 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.662995 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.668373 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.675975 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 02 13:01:19 crc kubenswrapper[4721]: W0202 13:01:19.692422 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-19f26c18af962b39775d2f11ae06ed7c602e93044b08d29c2f05ec27c97ef84b WatchSource:0}: Error finding container 19f26c18af962b39775d2f11ae06ed7c602e93044b08d29c2f05ec27c97ef84b: Status 404 returned error can't find the container with id 19f26c18af962b39775d2f11ae06ed7c602e93044b08d29c2f05ec27c97ef84b
Feb 02 13:01:19 crc kubenswrapper[4721]: I0202 13:01:19.913723 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:01:19 crc kubenswrapper[4721]: E0202 13:01:19.913897 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:01:20.913863996 +0000 UTC m=+21.216378385 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.015163 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.015282 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.015322 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.015502 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:01:20 crc kubenswrapper[4721]: E0202 13:01:20.015507 4721 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 02 13:01:20 crc kubenswrapper[4721]: E0202 13:01:20.015625 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 02 13:01:20 crc kubenswrapper[4721]: E0202 13:01:20.015654 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 02 13:01:20 crc kubenswrapper[4721]: E0202 13:01:20.015673 4721 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 02 13:01:20 crc kubenswrapper[4721]: E0202 13:01:20.015686 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:21.015662018 +0000 UTC m=+21.318176427 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 02 13:01:20 crc kubenswrapper[4721]: E0202 13:01:20.015714 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:21.015700439 +0000 UTC m=+21.318214838 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 02 13:01:20 crc kubenswrapper[4721]: E0202 13:01:20.015797 4721 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 02 13:01:20 crc kubenswrapper[4721]: E0202 13:01:20.015835 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:21.015825372 +0000 UTC m=+21.318339771 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 02 13:01:20 crc kubenswrapper[4721]: E0202 13:01:20.015924 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 02 13:01:20 crc kubenswrapper[4721]: E0202 13:01:20.015945 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 02 13:01:20 crc kubenswrapper[4721]: E0202 13:01:20.015956 4721 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 02 13:01:20 crc kubenswrapper[4721]: E0202 13:01:20.015988 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:21.015979396 +0000 UTC m=+21.318493795 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.363134 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 02:44:05.478460878 +0000 UTC
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.409525 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:01:20 crc kubenswrapper[4721]: E0202 13:01:20.409697 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.413435 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.414184 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.415281 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.415916 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.416865 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.417399 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.417999 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.418939 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.419600 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.420601 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.421239 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.422288 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.422741 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.423238 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.424151 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.424628 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.425628 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.426105 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.426810 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.428029 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.428687 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.429653 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.430230 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.431158 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.431779 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.433836 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.435318 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.435950 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.436673 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.437830 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.438511 4721 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.438709 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.441139 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.441928 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.442683 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.444887 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.445035 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:20Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.445856 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.446517 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.447887 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.449415 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.449928 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.450760 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.451917 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.452964 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.453550 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.454618 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.455347 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.456534 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.457117 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.458013 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.458682 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.459357 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.460611 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.461246 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.466982 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:20Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.488913 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:20Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.514456 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:20Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.542686 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443"}
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.542744 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bdabbc19c77153242b2b6ba0235517389b7b070064dfcaddbe90f8cdd80faf89"}
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.544909 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.546656 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:20Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.546921 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e"}
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.547659 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.548764 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"19f26c18af962b39775d2f11ae06ed7c602e93044b08d29c2f05ec27c97ef84b"}
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.551347 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac"}
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.551376 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68"}
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.551386 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d0934c59dd2f80b7ce756beac48bdb978f52b606f888232b5602aa31229ef731"}
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.569295 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:20Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.587215 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:20Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.603624 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:20Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.617630 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:20Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.632429 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:20Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.646239 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.665027 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.681668 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.698485 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:20 crc kubenswrapper[4721]: I0202 13:01:20.924056 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:01:20 crc kubenswrapper[4721]: E0202 13:01:20.924244 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:01:22.924224977 +0000 UTC m=+23.226739366 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:01:21 crc kubenswrapper[4721]: I0202 13:01:21.024756 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:21 crc kubenswrapper[4721]: I0202 13:01:21.024812 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:21 crc kubenswrapper[4721]: I0202 13:01:21.024834 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:21 crc kubenswrapper[4721]: I0202 13:01:21.024855 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:21 crc kubenswrapper[4721]: E0202 13:01:21.024934 4721 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 13:01:21 crc kubenswrapper[4721]: E0202 13:01:21.024947 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 13:01:21 crc kubenswrapper[4721]: E0202 13:01:21.024969 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 13:01:21 crc kubenswrapper[4721]: E0202 13:01:21.024982 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:23.02496882 +0000 UTC m=+23.327483209 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 13:01:21 crc kubenswrapper[4721]: E0202 13:01:21.024982 4721 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:21 crc kubenswrapper[4721]: E0202 13:01:21.025015 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:23.025006291 +0000 UTC m=+23.327520680 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:21 crc kubenswrapper[4721]: E0202 13:01:21.025033 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 13:01:21 crc kubenswrapper[4721]: E0202 13:01:21.025053 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 13:01:21 crc kubenswrapper[4721]: E0202 13:01:21.025056 4721 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 13:01:21 crc kubenswrapper[4721]: E0202 13:01:21.025078 4721 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:21 crc kubenswrapper[4721]: E0202 13:01:21.025094 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:23.025085973 +0000 UTC m=+23.327600362 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 13:01:21 crc kubenswrapper[4721]: E0202 13:01:21.025109 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:23.025102044 +0000 UTC m=+23.327616443 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:21 crc kubenswrapper[4721]: I0202 13:01:21.364048 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 05:08:03.944200517 +0000 UTC Feb 02 13:01:21 crc kubenswrapper[4721]: I0202 13:01:21.408960 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:21 crc kubenswrapper[4721]: I0202 13:01:21.409113 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:21 crc kubenswrapper[4721]: E0202 13:01:21.409154 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:01:21 crc kubenswrapper[4721]: E0202 13:01:21.409359 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:01:22 crc kubenswrapper[4721]: I0202 13:01:22.364601 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 07:33:01.891982329 +0000 UTC Feb 02 13:01:22 crc kubenswrapper[4721]: I0202 13:01:22.409337 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:22 crc kubenswrapper[4721]: E0202 13:01:22.409562 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:01:22 crc kubenswrapper[4721]: I0202 13:01:22.560001 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221"} Feb 02 13:01:22 crc kubenswrapper[4721]: I0202 13:01:22.578212 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:22 crc kubenswrapper[4721]: I0202 13:01:22.597428 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:22 crc kubenswrapper[4721]: I0202 13:01:22.621524 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:22 crc kubenswrapper[4721]: I0202 13:01:22.637757 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:22 crc kubenswrapper[4721]: I0202 13:01:22.654403 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:22 crc kubenswrapper[4721]: I0202 13:01:22.676582 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:22 crc kubenswrapper[4721]: I0202 13:01:22.697600 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:22Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:22 crc kubenswrapper[4721]: I0202 13:01:22.941778 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:01:22 crc kubenswrapper[4721]: E0202 13:01:22.941930 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:01:26.941909546 +0000 UTC m=+27.244423945 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.042332 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.042385 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.042421 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.042469 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 13:01:23 crc kubenswrapper[4721]: E0202 13:01:23.042526 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 02 13:01:23 crc kubenswrapper[4721]: E0202 13:01:23.042560 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 02 13:01:23 crc kubenswrapper[4721]: E0202 13:01:23.042563 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 02 13:01:23 crc kubenswrapper[4721]: E0202 13:01:23.042579 4721 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 02 13:01:23 crc kubenswrapper[4721]: E0202 13:01:23.042583 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 02 13:01:23 crc kubenswrapper[4721]: E0202 13:01:23.042595 4721 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 02 13:01:23 crc kubenswrapper[4721]: E0202 13:01:23.042647 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:27.042622409 +0000 UTC m=+27.345136808 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 02 13:01:23 crc kubenswrapper[4721]: E0202 13:01:23.042657 4721 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 02 13:01:23 crc kubenswrapper[4721]: E0202 13:01:23.042675 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:27.04266591 +0000 UTC m=+27.345180309 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:23 crc kubenswrapper[4721]: E0202 13:01:23.042787 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:27.042758903 +0000 UTC m=+27.345273352 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 13:01:23 crc kubenswrapper[4721]: E0202 13:01:23.042670 4721 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 13:01:23 crc kubenswrapper[4721]: E0202 13:01:23.042852 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:27.042838755 +0000 UTC m=+27.345353224 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.365660 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 15:30:56.03846493 +0000 UTC Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.409693 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.409754 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:23 crc kubenswrapper[4721]: E0202 13:01:23.409942 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:01:23 crc kubenswrapper[4721]: E0202 13:01:23.410137 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.641415 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.659362 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.663653 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:23Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.664262 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.680740 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:23Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.699032 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:23Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.714233 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:23Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.729605 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:23Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.743052 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:23Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.756867 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:23Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.782639 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:23Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.798783 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:23Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.813738 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:23Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.826031 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:23Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.842887 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:23Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.856723 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:23Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.871275 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:23Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:23 crc kubenswrapper[4721]: I0202 13:01:23.885109 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:23Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:24 crc kubenswrapper[4721]: I0202 13:01:24.366756 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 11:32:26.322880796 +0000 UTC
Feb 02 13:01:24 crc kubenswrapper[4721]: I0202 13:01:24.409613 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:01:24 crc kubenswrapper[4721]: E0202 13:01:24.409856 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 13:01:24 crc kubenswrapper[4721]: I0202 13:01:24.995223 4721 csr.go:261] certificate signing request csr-pcbw8 is approved, waiting to be issued
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.010641 4721 csr.go:257] certificate signing request csr-pcbw8 is issued
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.090999 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-sgp8m"]
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.091458 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sgp8m"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.097481 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.098055 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.098498 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.145654 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:25Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.161665 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/13f3ec54-a7fb-4236-9583-827d960b2086-hosts-file\") pod \"node-resolver-sgp8m\" (UID: \"13f3ec54-a7fb-4236-9583-827d960b2086\") " pod="openshift-dns/node-resolver-sgp8m"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.161712 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n42qk\" (UniqueName: \"kubernetes.io/projected/13f3ec54-a7fb-4236-9583-827d960b2086-kube-api-access-n42qk\") pod \"node-resolver-sgp8m\" (UID: \"13f3ec54-a7fb-4236-9583-827d960b2086\") " pod="openshift-dns/node-resolver-sgp8m"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.174793 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:25Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.193300 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:25Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.212740 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:25Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.229083 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:25Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.241358 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:25Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.260269 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:25Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.262622 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/13f3ec54-a7fb-4236-9583-827d960b2086-hosts-file\") pod \"node-resolver-sgp8m\" (UID: \"13f3ec54-a7fb-4236-9583-827d960b2086\") " pod="openshift-dns/node-resolver-sgp8m"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.262669 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n42qk\" (UniqueName: \"kubernetes.io/projected/13f3ec54-a7fb-4236-9583-827d960b2086-kube-api-access-n42qk\") pod \"node-resolver-sgp8m\" (UID: \"13f3ec54-a7fb-4236-9583-827d960b2086\") " pod="openshift-dns/node-resolver-sgp8m"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.262758 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/13f3ec54-a7fb-4236-9583-827d960b2086-hosts-file\") pod \"node-resolver-sgp8m\" (UID: \"13f3ec54-a7fb-4236-9583-827d960b2086\") " pod="openshift-dns/node-resolver-sgp8m"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.274019 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:25Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.284320 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n42qk\" (UniqueName: \"kubernetes.io/projected/13f3ec54-a7fb-4236-9583-827d960b2086-kube-api-access-n42qk\") pod \"node-resolver-sgp8m\" (UID: \"13f3ec54-a7fb-4236-9583-827d960b2086\") " pod="openshift-dns/node-resolver-sgp8m"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.289148 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:25Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.367103 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 17:23:24.279180963 +0000 UTC
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.404615 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sgp8m"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.408825 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.408880 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 13:01:25 crc kubenswrapper[4721]: E0202 13:01:25.408990 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 13:01:25 crc kubenswrapper[4721]: E0202 13:01:25.409174 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 13:01:25 crc kubenswrapper[4721]: W0202 13:01:25.415526 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13f3ec54_a7fb_4236_9583_827d960b2086.slice/crio-18c27a1729151a5401a1154b35ef853fa559bec7a8014eec078050188649df3a WatchSource:0}: Error finding container 18c27a1729151a5401a1154b35ef853fa559bec7a8014eec078050188649df3a: Status 404 returned error can't find the container with id 18c27a1729151a5401a1154b35ef853fa559bec7a8014eec078050188649df3a
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.495475 4721 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.497240 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.497289 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.497304 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.497381 4721 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.504703 4721 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.505026 4721 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.506236 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.506273 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.506285 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.506303 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.506314 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:25Z","lastTransitionTime":"2026-02-02T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:25 crc kubenswrapper[4721]: E0202 13:01:25.523961 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:25Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.527511 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.527557 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.527571 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.527588 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.527601 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:25Z","lastTransitionTime":"2026-02-02T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:25 crc kubenswrapper[4721]: E0202 13:01:25.538685 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:25Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.542532 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.542572 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.542584 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.542600 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.542611 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:25Z","lastTransitionTime":"2026-02-02T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:25 crc kubenswrapper[4721]: E0202 13:01:25.556295 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:25Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.561218 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.561270 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.561285 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.561306 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.561326 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:25Z","lastTransitionTime":"2026-02-02T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.569780 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sgp8m" event={"ID":"13f3ec54-a7fb-4236-9583-827d960b2086","Type":"ContainerStarted","Data":"18c27a1729151a5401a1154b35ef853fa559bec7a8014eec078050188649df3a"} Feb 02 13:01:25 crc kubenswrapper[4721]: E0202 13:01:25.577245 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:25Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.581024 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.581048 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.581055 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.581083 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.581092 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:25Z","lastTransitionTime":"2026-02-02T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:25 crc kubenswrapper[4721]: E0202 13:01:25.610632 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:25Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:25 crc kubenswrapper[4721]: E0202 13:01:25.610744 4721 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.612330 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.612358 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.612367 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.612383 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.612394 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:25Z","lastTransitionTime":"2026-02-02T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.715287 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.715339 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.715349 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.715364 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.715375 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:25Z","lastTransitionTime":"2026-02-02T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.818272 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.818329 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.818340 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.818355 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.818365 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:25Z","lastTransitionTime":"2026-02-02T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.921377 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.921433 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.921452 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.921473 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:25 crc kubenswrapper[4721]: I0202 13:01:25.921489 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:25Z","lastTransitionTime":"2026-02-02T13:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.009235 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-ltw7d"]
Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.009548 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-x4lhg"]
Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.009698 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ltw7d"
Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.010111 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.011356 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.011669 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 02 13:01:26 crc kubenswrapper[4721]: W0202 13:01:26.014215 4721 reflector.go:561] object-"openshift-multus"/"default-cni-sysctl-allowlist": failed to list *v1.ConfigMap: configmaps "default-cni-sysctl-allowlist" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 02 13:01:26 crc kubenswrapper[4721]: E0202 13:01:26.014263 4721 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"default-cni-sysctl-allowlist\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 02 13:01:26 crc kubenswrapper[4721]: W0202 13:01:26.014314 4721 reflector.go:561] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": failed to list *v1.Secret: secrets "multus-ancillary-tools-dockercfg-vnmsz" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 02 13:01:26 crc kubenswrapper[4721]: E0202 13:01:26.014328 4721 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vnmsz\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"multus-ancillary-tools-dockercfg-vnmsz\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.014440 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.014882 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-rppjz"] Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.015174 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.015187 4721 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-02 12:56:25 +0000 UTC, rotation deadline is 2026-11-02 08:24:05.45723808 +0000 UTC Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.015223 4721 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6547h22m39.442018442s for next certificate rotation Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.015298 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.015450 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.015548 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pwcs2"] Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.016283 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.018009 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.019133 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.019334 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.023657 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.023700 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.023714 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.023747 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.023761 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:26Z","lastTransitionTime":"2026-02-02T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.024194 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.024194 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.024261 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.024363 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.024476 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.025834 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.027934 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.027988 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.028018 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.041668 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.067767 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.080506 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.097347 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.118578 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.126006 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.126050 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.126078 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.126096 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.126110 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:26Z","lastTransitionTime":"2026-02-02T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.150629 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170215 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04b1629d-0184-4975-8d4b-7a32913e7389-os-release\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170256 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-var-lib-openvswitch\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170273 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-var-lib-cni-multus\") pod \"multus-ltw7d\" (UID: 
\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170286 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-var-lib-kubelet\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170348 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04b1629d-0184-4975-8d4b-7a32913e7389-cnibin\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170407 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88f6w\" (UniqueName: \"kubernetes.io/projected/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-kube-api-access-88f6w\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170456 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm6j6\" (UniqueName: \"kubernetes.io/projected/bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877-kube-api-access-nm6j6\") pod \"machine-config-daemon-rppjz\" (UID: \"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\") " pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170507 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170533 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-system-cni-dir\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170561 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04b1629d-0184-4975-8d4b-7a32913e7389-system-cni-dir\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170583 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-var-lib-cni-bin\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170606 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-h2tcn\" (UniqueName: \"kubernetes.io/projected/04b1629d-0184-4975-8d4b-7a32913e7389-kube-api-access-h2tcn\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170627 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877-mcd-auth-proxy-config\") pod \"machine-config-daemon-rppjz\" (UID: \"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\") " pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170649 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-cnibin\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170670 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-multus-socket-dir-parent\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170692 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-etc-kubernetes\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170717 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-etc-openvswitch\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170740 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-ovn\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170770 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-env-overrides\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170792 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-run-k8s-cni-cncf-io\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 
13:01:26.170811 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-multus-daemon-config\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170827 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04b1629d-0184-4975-8d4b-7a32913e7389-cni-binary-copy\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170842 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-systemd-units\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170856 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-systemd\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170872 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-multus-cni-dir\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170886 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-run-netns\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170903 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877-proxy-tls\") pod \"machine-config-daemon-rppjz\" (UID: \"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\") " pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170923 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-cni-netd\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170937 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/04b1629d-0184-4975-8d4b-7a32913e7389-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " 
pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170951 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-run-netns\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170967 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-log-socket\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.170981 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovnkube-config\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171000 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovn-node-metrics-cert\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171030 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-run-multus-certs\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171055 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877-rootfs\") pod \"machine-config-daemon-rppjz\" (UID: \"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\") " pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171088 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-slash\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171122 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-cni-bin\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171142 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dchqf\" (UniqueName: 
\"kubernetes.io/projected/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-kube-api-access-dchqf\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171162 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-run-ovn-kubernetes\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171186 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovnkube-script-lib\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171208 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04b1629d-0184-4975-8d4b-7a32913e7389-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171228 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-kubelet\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171247 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-node-log\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171264 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-cni-binary-copy\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171290 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-os-release\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171307 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-hostroot\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171328 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-multus-conf-dir\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.171357 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-openvswitch\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.178986 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.179108 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.187925 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.192555 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.228772 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.228807 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.228817 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.228834 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.228847 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:26Z","lastTransitionTime":"2026-02-02T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.246454 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.266500 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272266 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-slash\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272313 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-cni-bin\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272336 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dchqf\" (UniqueName: \"kubernetes.io/projected/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-kube-api-access-dchqf\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272363 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-run-ovn-kubernetes\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272388 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovnkube-script-lib\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272419 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04b1629d-0184-4975-8d4b-7a32913e7389-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272440 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-kubelet\") pod 
\"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272458 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-node-log\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272477 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-cni-binary-copy\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272507 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-openvswitch\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272527 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-os-release\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272545 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-hostroot\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272564 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-multus-conf-dir\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272590 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04b1629d-0184-4975-8d4b-7a32913e7389-cnibin\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272612 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04b1629d-0184-4975-8d4b-7a32913e7389-os-release\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272632 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-var-lib-openvswitch\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: 
I0202 13:01:26.272652 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-var-lib-cni-multus\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272672 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-var-lib-kubelet\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272699 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88f6w\" (UniqueName: \"kubernetes.io/projected/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-kube-api-access-88f6w\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272723 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm6j6\" (UniqueName: \"kubernetes.io/projected/bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877-kube-api-access-nm6j6\") pod \"machine-config-daemon-rppjz\" (UID: \"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\") " pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272749 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04b1629d-0184-4975-8d4b-7a32913e7389-system-cni-dir\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272773 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272797 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-system-cni-dir\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272820 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2tcn\" (UniqueName: \"kubernetes.io/projected/04b1629d-0184-4975-8d4b-7a32913e7389-kube-api-access-h2tcn\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272843 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-var-lib-cni-bin\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 
02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272862 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-cnibin\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272885 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-multus-socket-dir-parent\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272905 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-etc-kubernetes\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272928 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877-mcd-auth-proxy-config\") pod \"machine-config-daemon-rppjz\" (UID: \"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\") " pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272950 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-etc-openvswitch\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272972 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04b1629d-0184-4975-8d4b-7a32913e7389-cni-binary-copy\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.272992 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-systemd-units\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273012 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-ovn\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273032 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-env-overrides\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273056 4721 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-run-k8s-cni-cncf-io\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273098 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-multus-daemon-config\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273120 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-systemd\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273140 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-multus-cni-dir\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273161 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-run-netns\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273182 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877-proxy-tls\") pod \"machine-config-daemon-rppjz\" (UID: \"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\") " pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273202 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-cni-netd\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273223 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/04b1629d-0184-4975-8d4b-7a32913e7389-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273244 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-run-netns\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273265 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovnkube-config\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273286 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-log-socket\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273305 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovn-node-metrics-cert\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273327 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-run-multus-certs\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273377 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877-rootfs\") pod \"machine-config-daemon-rppjz\" (UID: \"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\") " pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273456 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877-rootfs\") pod \"machine-config-daemon-rppjz\" (UID: \"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\") " pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273496 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-slash\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273527 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-cni-bin\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273745 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-run-ovn-kubernetes\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.273952 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-cnibin\") pod \"multus-ltw7d\" (UID: 
\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274102 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-var-lib-openvswitch\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274129 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-hostroot\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274152 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-multus-conf-dir\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274161 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-os-release\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274186 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04b1629d-0184-4975-8d4b-7a32913e7389-os-release\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274198 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-kubelet\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274210 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-systemd\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274226 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-node-log\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274244 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-multus-socket-dir-parent\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274265 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-etc-kubernetes\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274271 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-var-lib-cni-multus\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274328 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04b1629d-0184-4975-8d4b-7a32913e7389-system-cni-dir\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274274 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04b1629d-0184-4975-8d4b-7a32913e7389-cnibin\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274400 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274413 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-var-lib-kubelet\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274479 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-system-cni-dir\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274479 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovnkube-script-lib\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274688 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-run-netns\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274684 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-multus-cni-dir\") pod \"multus-ltw7d\" (UID: 
\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274723 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-ovn\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274762 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-cni-netd\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274777 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-cni-binary-copy\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274818 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-openvswitch\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274834 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-run-netns\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274840 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877-mcd-auth-proxy-config\") pod \"machine-config-daemon-rppjz\" (UID: \"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\") " pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274847 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-var-lib-cni-bin\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274874 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-run-multus-certs\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274895 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-host-run-k8s-cni-cncf-io\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.274968 4721 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-log-socket\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.275140 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-systemd-units\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.275199 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-etc-openvswitch\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.275229 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-env-overrides\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.275309 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04b1629d-0184-4975-8d4b-7a32913e7389-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.275444 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-multus-daemon-config\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.275517 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovnkube-config\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.275882 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04b1629d-0184-4975-8d4b-7a32913e7389-cni-binary-copy\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.281047 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovn-node-metrics-cert\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.282674 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877-proxy-tls\") pod \"machine-config-daemon-rppjz\" (UID: \"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\") " pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.293703 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm6j6\" (UniqueName: \"kubernetes.io/projected/bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877-kube-api-access-nm6j6\") pod \"machine-config-daemon-rppjz\" (UID: \"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\") " pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.295582 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dchqf\" (UniqueName: \"kubernetes.io/projected/5ba84858-caaa-4fba-8eaf-9f7ddece0b3a-kube-api-access-dchqf\") pod \"multus-ltw7d\" (UID: \"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\") " pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.299254 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.301878 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88f6w\" (UniqueName: \"kubernetes.io/projected/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-kube-api-access-88f6w\") pod \"ovnkube-node-pwcs2\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") " pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.304101 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2tcn\" (UniqueName: \"kubernetes.io/projected/04b1629d-0184-4975-8d4b-7a32913e7389-kube-api-access-h2tcn\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.319662 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.324958 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-ltw7d" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.330723 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.330909 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.330970 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.331035 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.331115 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:26Z","lastTransitionTime":"2026-02-02T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.336150 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: W0202 13:01:26.337339 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ba84858_caaa_4fba_8eaf_9f7ddece0b3a.slice/crio-7c1e0a9b51e6a17d1d4ddea80a323e586d38fb5a143aeb26bc485a08471c30fd WatchSource:0}: Error finding container 7c1e0a9b51e6a17d1d4ddea80a323e586d38fb5a143aeb26bc485a08471c30fd: Status 404 returned error can't find the container with id 7c1e0a9b51e6a17d1d4ddea80a323e586d38fb5a143aeb26bc485a08471c30fd Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.343318 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.352511 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.353876 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: W0202 13:01:26.364202 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc8c3bf4_0f02_47a1_b8b8_1e40a8daa877.slice/crio-088d31a818c8476f9d984933c4a955becb0ddb7e01581d5b834cf1979d30cd8a WatchSource:0}: Error finding container 088d31a818c8476f9d984933c4a955becb0ddb7e01581d5b834cf1979d30cd8a: Status 404 returned error can't find the container with id 088d31a818c8476f9d984933c4a955becb0ddb7e01581d5b834cf1979d30cd8a Feb 02 13:01:26 crc kubenswrapper[4721]: W0202 13:01:26.366612 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb15bc48d_f88d_4b38_a9e1_00bb00b88a52.slice/crio-1aec1a12b2d4ba708a47d40a6a1a8e146d69e8b9d3bc97bc1257a9e8fc573862 WatchSource:0}: Error finding container 1aec1a12b2d4ba708a47d40a6a1a8e146d69e8b9d3bc97bc1257a9e8fc573862: Status 404 returned error can't find the container with id 1aec1a12b2d4ba708a47d40a6a1a8e146d69e8b9d3bc97bc1257a9e8fc573862 Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.368726 4721 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 22:11:50.765167107 +0000 UTC Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.376873 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.392060 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.405734 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is 
after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.408610 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:26 crc kubenswrapper[4721]: E0202 13:01:26.408735 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.420277 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.434130 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.434181 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.434194 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.434215 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.434228 4721 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:26Z","lastTransitionTime":"2026-02-02T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.440639 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.456450 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.472049 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.483817 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.497538 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\
\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.509011 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.528428 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.536795 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.536839 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.536852 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.536870 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.536882 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:26Z","lastTransitionTime":"2026-02-02T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.574354 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltw7d" event={"ID":"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a","Type":"ContainerStarted","Data":"0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.575119 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltw7d" event={"ID":"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a","Type":"ContainerStarted","Data":"7c1e0a9b51e6a17d1d4ddea80a323e586d38fb5a143aeb26bc485a08471c30fd"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.575674 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sgp8m" event={"ID":"13f3ec54-a7fb-4236-9583-827d960b2086","Type":"ContainerStarted","Data":"ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.577005 4721 generic.go:334] "Generic (PLEG): container finished" podID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerID="406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b" exitCode=0 Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.577083 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerDied","Data":"406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.577107 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerStarted","Data":"1aec1a12b2d4ba708a47d40a6a1a8e146d69e8b9d3bc97bc1257a9e8fc573862"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.579041 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.579083 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"088d31a818c8476f9d984933c4a955becb0ddb7e01581d5b834cf1979d30cd8a"} Feb 02 13:01:26 crc kubenswrapper[4721]: E0202 13:01:26.585252 4721 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.598173 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.616423 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.634890 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.638572 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.638604 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.638612 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.638626 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.638637 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:26Z","lastTransitionTime":"2026-02-02T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.648552 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.661979 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.675294 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.687734 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.707749 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: 
I0202 13:01:26.722873 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.741396 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.741436 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.741446 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.741459 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.741475 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:26Z","lastTransitionTime":"2026-02-02T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.742456 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.758890 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.782291 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.794721 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.805782 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is 
after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.818940 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.832530 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.844059 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.844123 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.844134 4721 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.844151 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.844164 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:26Z","lastTransitionTime":"2026-02-02T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.845125 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.869824 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.886621 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.900304 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.913358 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.929620 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.944526 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.946107 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.946133 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.946141 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.946155 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.946165 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:26Z","lastTransitionTime":"2026-02-02T13:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.959775 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.975731 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.977894 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:01:26 crc kubenswrapper[4721]: E0202 
13:01:26.978056 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:01:34.978030956 +0000 UTC m=+35.280545345 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:01:26 crc kubenswrapper[4721]: I0202 13:01:26.989943 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:
//374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:26Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.007091 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.026519 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.048572 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.048798 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.048875 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.048944 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.049002 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:27Z","lastTransitionTime":"2026-02-02T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.078496 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.078710 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.078659 4721 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.078809 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:35.07879143 +0000 UTC m=+35.381305819 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.078955 4721 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.079077 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:35.079039916 +0000 UTC m=+35.381554355 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.079198 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.079227 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.079322 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.079394 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.079432 4721 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.079471 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:35.079462788 +0000 UTC m=+35.381977177 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.079477 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.079516 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.079531 4721 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.079590 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:35.079573101 +0000 UTC m=+35.382087490 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.151640 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.151674 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.151683 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.151697 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.151707 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:27Z","lastTransitionTime":"2026-02-02T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.253852 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.253919 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.253928 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.253942 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.253951 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:27Z","lastTransitionTime":"2026-02-02T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.275570 4721 configmap.go:193] Couldn't get configMap openshift-multus/default-cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.275948 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/04b1629d-0184-4975-8d4b-7a32913e7389-cni-sysctl-allowlist podName:04b1629d-0184-4975-8d4b-7a32913e7389 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:27.775922841 +0000 UTC m=+28.078437230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/04b1629d-0184-4975-8d4b-7a32913e7389-cni-sysctl-allowlist") pod "multus-additional-cni-plugins-x4lhg" (UID: "04b1629d-0184-4975-8d4b-7a32913e7389") : failed to sync configmap cache: timed out waiting for the condition Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.356335 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.356378 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.356388 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.356405 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.356418 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:27Z","lastTransitionTime":"2026-02-02T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.369777 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 10:04:00.727329603 +0000 UTC Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.380090 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.385373 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.408840 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.408949 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.409120 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:01:27 crc kubenswrapper[4721]: E0202 13:01:27.409258 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.458761 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.458815 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.458831 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.458857 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.458873 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:27Z","lastTransitionTime":"2026-02-02T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.562365 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.562904 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.562915 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.562934 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.562947 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:27Z","lastTransitionTime":"2026-02-02T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.592693 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerStarted","Data":"3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5"} Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.592753 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerStarted","Data":"677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34"} Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.592768 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerStarted","Data":"51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3"} Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.592781 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerStarted","Data":"dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6"} Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.592794 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerStarted","Data":"4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301"} Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.592806 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerStarted","Data":"30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8"} Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.594851 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b"} Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.610395 4721 
Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.610395 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.622676 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.644556 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.658449 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.665568 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.665626 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.665640 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.665657 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.665670 4721 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:27Z","lastTransitionTime":"2026-02-02T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.675055 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.688610 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.702875 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.718432 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.732325 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358257
71aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.742503 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.759755 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.768223 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.768302 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.768315 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.768334 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.768348 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:27Z","lastTransitionTime":"2026-02-02T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.773146 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.787553 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.790196 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/04b1629d-0184-4975-8d4b-7a32913e7389-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.790827 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/04b1629d-0184-4975-8d4b-7a32913e7389-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x4lhg\" (UID: \"04b1629d-0184-4975-8d4b-7a32913e7389\") " pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.802668 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:27Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.834543 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.873056 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.873137 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.873147 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.873161 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.873172 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:27Z","lastTransitionTime":"2026-02-02T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.978821 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.978875 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.978885 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.978907 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:27 crc kubenswrapper[4721]: I0202 13:01:27.978921 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:27Z","lastTransitionTime":"2026-02-02T13:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.081667 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.081721 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.081735 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.081754 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.081768 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:28Z","lastTransitionTime":"2026-02-02T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.184044 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.184461 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.184471 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.184489 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.184499 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:28Z","lastTransitionTime":"2026-02-02T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.288221 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.288280 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.288293 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.288313 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.288323 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:28Z","lastTransitionTime":"2026-02-02T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.370360 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 02:36:32.808931209 +0000 UTC Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.390705 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.390753 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.390763 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.390784 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.390796 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:28Z","lastTransitionTime":"2026-02-02T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.409486 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:28 crc kubenswrapper[4721]: E0202 13:01:28.409616 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.493598 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.493639 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.493648 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.493663 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.493672 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:28Z","lastTransitionTime":"2026-02-02T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.596800 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.596838 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.596847 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.596862 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.596874 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:28Z","lastTransitionTime":"2026-02-02T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.599691 4721 generic.go:334] "Generic (PLEG): container finished" podID="04b1629d-0184-4975-8d4b-7a32913e7389" containerID="300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4" exitCode=0 Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.599789 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" event={"ID":"04b1629d-0184-4975-8d4b-7a32913e7389","Type":"ContainerDied","Data":"300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4"} Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.599856 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" event={"ID":"04b1629d-0184-4975-8d4b-7a32913e7389","Type":"ContainerStarted","Data":"e357aed1992a77065bb25095d84e47a76f6e642eb4294de7abc9e22b6f097e1f"} Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.625712 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.642600 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.656012 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.672560 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.687916 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc 
kubenswrapper[4721]: I0202 13:01:28.696150 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-48kgl"] Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.696574 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-48kgl" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.698048 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.698261 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.698462 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.699466 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.699734 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.699770 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.699782 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.699800 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.699814 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:28Z","lastTransitionTime":"2026-02-02T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.704414 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.717712 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.738408 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.753021 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.766875 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.780421 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.799764 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.800426 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09b4ea41-ceb5-481a-899e-c2876ced6d49-host\") pod \"node-ca-48kgl\" (UID: \"09b4ea41-ceb5-481a-899e-c2876ced6d49\") " pod="openshift-image-registry/node-ca-48kgl" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.800472 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/09b4ea41-ceb5-481a-899e-c2876ced6d49-serviceca\") pod \"node-ca-48kgl\" (UID: \"09b4ea41-ceb5-481a-899e-c2876ced6d49\") " pod="openshift-image-registry/node-ca-48kgl" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.800517 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p6j7\" (UniqueName: \"kubernetes.io/projected/09b4ea41-ceb5-481a-899e-c2876ced6d49-kube-api-access-8p6j7\") pod \"node-ca-48kgl\" (UID: \"09b4ea41-ceb5-481a-899e-c2876ced6d49\") " pod="openshift-image-registry/node-ca-48kgl" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.801956 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.801978 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.801986 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.802002 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.802012 4721 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:28Z","lastTransitionTime":"2026-02-02T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.813606 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.825574 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.845179 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\"
:0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.858026 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.869343 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.883267 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.896157 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.901937 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09b4ea41-ceb5-481a-899e-c2876ced6d49-host\") pod \"node-ca-48kgl\" (UID: \"09b4ea41-ceb5-481a-899e-c2876ced6d49\") " pod="openshift-image-registry/node-ca-48kgl" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.902027 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09b4ea41-ceb5-481a-899e-c2876ced6d49-host\") pod \"node-ca-48kgl\" (UID: \"09b4ea41-ceb5-481a-899e-c2876ced6d49\") " pod="openshift-image-registry/node-ca-48kgl" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.902161 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serviceca\" (UniqueName: \"kubernetes.io/configmap/09b4ea41-ceb5-481a-899e-c2876ced6d49-serviceca\") pod \"node-ca-48kgl\" (UID: \"09b4ea41-ceb5-481a-899e-c2876ced6d49\") " pod="openshift-image-registry/node-ca-48kgl" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.903409 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/09b4ea41-ceb5-481a-899e-c2876ced6d49-serviceca\") pod \"node-ca-48kgl\" (UID: \"09b4ea41-ceb5-481a-899e-c2876ced6d49\") " pod="openshift-image-registry/node-ca-48kgl" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.903458 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6j7\" (UniqueName: \"kubernetes.io/projected/09b4ea41-ceb5-481a-899e-c2876ced6d49-kube-api-access-8p6j7\") pod \"node-ca-48kgl\" (UID: \"09b4ea41-ceb5-481a-899e-c2876ced6d49\") " pod="openshift-image-registry/node-ca-48kgl" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.903828 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.903877 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.903889 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.903909 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.903922 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:28Z","lastTransitionTime":"2026-02-02T13:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.910249 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.921811 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p6j7\" (UniqueName: \"kubernetes.io/projected/09b4ea41-ceb5-481a-899e-c2876ced6d49-kube-api-access-8p6j7\") pod \"node-ca-48kgl\" (UID: \"09b4ea41-ceb5-481a-899e-c2876ced6d49\") " pod="openshift-image-registry/node-ca-48kgl" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.922375 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.937564 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.950170 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.964153 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:28 crc kubenswrapper[4721]: I0202 13:01:28.986848 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.001569 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:28Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.006508 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.006559 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.006578 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.006601 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.006619 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:29Z","lastTransitionTime":"2026-02-02T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.013749 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-48kgl" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.018950 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: W0202 13:01:29.030851 4721 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09b4ea41_ceb5_481a_899e_c2876ced6d49.slice/crio-394544a1b172ae42c05f9a363db8b77471aa0ca9e90527859d81be99582e16f1 WatchSource:0}: Error finding container 394544a1b172ae42c05f9a363db8b77471aa0ca9e90527859d81be99582e16f1: Status 404 returned error can't find the container with id 394544a1b172ae42c05f9a363db8b77471aa0ca9e90527859d81be99582e16f1 Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.037756 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.048901 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.108703 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.108741 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.108753 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.108769 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.108782 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:29Z","lastTransitionTime":"2026-02-02T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.227625 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.232653 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.232709 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.232729 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.232740 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:29Z","lastTransitionTime":"2026-02-02T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.335635 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.335678 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.335689 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.335706 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.335717 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:29Z","lastTransitionTime":"2026-02-02T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.373027 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 03:30:55.682887488 +0000 UTC Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.409486 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.409599 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:29 crc kubenswrapper[4721]: E0202 13:01:29.409765 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:01:29 crc kubenswrapper[4721]: E0202 13:01:29.409616 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.439281 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.439339 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.439351 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.439376 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.439391 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:29Z","lastTransitionTime":"2026-02-02T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.542086 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.542143 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.542153 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.542174 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.542187 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:29Z","lastTransitionTime":"2026-02-02T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.606003 4721 generic.go:334] "Generic (PLEG): container finished" podID="04b1629d-0184-4975-8d4b-7a32913e7389" containerID="1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb" exitCode=0 Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.606119 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" event={"ID":"04b1629d-0184-4975-8d4b-7a32913e7389","Type":"ContainerDied","Data":"1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb"} Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.612051 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerStarted","Data":"27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a"} Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.613511 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-48kgl" event={"ID":"09b4ea41-ceb5-481a-899e-c2876ced6d49","Type":"ContainerStarted","Data":"cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a"} Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.613540 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-48kgl" event={"ID":"09b4ea41-ceb5-481a-899e-c2876ced6d49","Type":"ContainerStarted","Data":"394544a1b172ae42c05f9a363db8b77471aa0ca9e90527859d81be99582e16f1"} Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.624872 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-
02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.641970 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift
-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.644757 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.644803 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.644815 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.644833 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.644845 4721 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:29Z","lastTransitionTime":"2026-02-02T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.660331 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.675474 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.690429 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.704593 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.717205 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.734105 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.750600 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.750637 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.750646 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.750661 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.750670 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:29Z","lastTransitionTime":"2026-02-02T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.752546 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.773773 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.787320 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.806905 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.825193 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc
7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.838821 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.852874 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.854117 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.854149 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.854159 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.854174 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.854186 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:29Z","lastTransitionTime":"2026-02-02T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.870377 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.885461 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.899319 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.911624 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.930143 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.943437 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.956983 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.959148 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.959168 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.959178 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.959195 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.959205 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:29Z","lastTransitionTime":"2026-02-02T13:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.971781 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 13:01:29 crc kubenswrapper[4721]: I0202 13:01:29.982806 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:29Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.002347 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.015433 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.027821 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.042367 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-
02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.053739 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf
5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.061928 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.062429 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.062442 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.062462 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.062478 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:30Z","lastTransitionTime":"2026-02-02T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.066835 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.085754 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.109032 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z 
is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.122030 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.133791 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.145805 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.164126 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.166190 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.166252 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.166267 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.166311 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.166328 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:30Z","lastTransitionTime":"2026-02-02T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.177399 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.191184 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.212461 4721 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.212546 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.263635 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.282010 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.286276 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.286314 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.286325 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.286345 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.286358 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:30Z","lastTransitionTime":"2026-02-02T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.310517 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.332102 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.352351 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-
02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.369122 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d746
2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.373855 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 09:18:23.560043886 +0000 UTC Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.385060 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.389587 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.389620 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.389632 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.389649 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.389661 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:30Z","lastTransitionTime":"2026-02-02T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.411844 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:30 crc kubenswrapper[4721]: E0202 13:01:30.412104 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.432724 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731f
b7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.447423 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.459678 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.474623 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.491894 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-
02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.493661 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.493696 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.493708 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.493725 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.493735 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:30Z","lastTransitionTime":"2026-02-02T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.506441 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.540593 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.582901 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.595715 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.595748 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.595758 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.595773 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.595783 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:30Z","lastTransitionTime":"2026-02-02T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.619616 4721 generic.go:334] "Generic (PLEG): container finished" podID="04b1629d-0184-4975-8d4b-7a32913e7389" containerID="7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b" exitCode=0 Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.619667 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" event={"ID":"04b1629d-0184-4975-8d4b-7a32913e7389","Type":"ContainerDied","Data":"7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b"} Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.622875 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a57
8bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.662759 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.698101 4721 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.698136 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.698146 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.698160 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.698169 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:30Z","lastTransitionTime":"2026-02-02T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.711312 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z 
is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.742550 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.780808 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.800537 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.800578 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.800587 4721 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.800604 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.800613 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:30Z","lastTransitionTime":"2026-02-02T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.820265 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.861240 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.901569 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.902695 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.902722 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.902733 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.902748 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.902759 4721 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:30Z","lastTransitionTime":"2026-02-02T13:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.942297 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:30 crc kubenswrapper[4721]: I0202 13:01:30.981851 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:30Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.005670 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.005713 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.005724 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.005740 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.005751 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:31Z","lastTransitionTime":"2026-02-02T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.021545 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.065274 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.109863 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.109939 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.109965 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.109998 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.110024 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:31Z","lastTransitionTime":"2026-02-02T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.110934 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.143311 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.189597 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.212769 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.212823 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.212835 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.212890 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.212908 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:31Z","lastTransitionTime":"2026-02-02T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.226149 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.262732 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.304097 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.315700 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.315761 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.315777 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.315797 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.315810 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:31Z","lastTransitionTime":"2026-02-02T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.342750 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.374782 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 15:11:06.699857754 +0000 UTC
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.390705 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.409084 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.409107 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 13:01:31 crc kubenswrapper[4721]: E0202 13:01:31.409288 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 13:01:31 crc kubenswrapper[4721]: E0202 13:01:31.409429 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.419745 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.419787 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.419797 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.419816 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.419826 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:31Z","lastTransitionTime":"2026-02-02T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.422557 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.460723 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.521833 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.521880 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.521889 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.521910 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.521928 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:31Z","lastTransitionTime":"2026-02-02T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.624894 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.624927 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.624935 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.624950 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.624962 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:31Z","lastTransitionTime":"2026-02-02T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.631102 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerStarted","Data":"364a12dc7d10ca52af13d40bb44682bd7d6d8458c365e3e2efc89bdbbfb13667"}
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.631430 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.634451 4721 generic.go:334] "Generic (PLEG): container finished" podID="04b1629d-0184-4975-8d4b-7a32913e7389" containerID="beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89" exitCode=0
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.634503 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" event={"ID":"04b1629d-0184-4975-8d4b-7a32913e7389","Type":"ContainerDied","Data":"beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89"}
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.649107 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.704514 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.709832 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.727421 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.727589 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.727676 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.727749 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.727653 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.727817 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:31Z","lastTransitionTime":"2026-02-02T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.745797 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.773939 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.792206 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.804139 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.820964 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.830656 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.830707 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.830719 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.830736 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.830789 4721 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:31Z","lastTransitionTime":"2026-02-02T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.836177 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.865733 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.903001 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\
\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 
13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.935267 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.935305 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.935314 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.935329 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.935340 4721 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:31Z","lastTransitionTime":"2026-02-02T13:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.941725 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:31 crc kubenswrapper[4721]: I0202 13:01:31.986138 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364a12dc7d10ca52af13d40bb44682bd7d6d8458c365e3e2efc89bdbbfb13667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:31Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.034889 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.039035 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.039144 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.039158 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.039181 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.039196 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:32Z","lastTransitionTime":"2026-02-02T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.062964 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.104739 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.141418 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.141455 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.141466 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.141482 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.141493 4721 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:32Z","lastTransitionTime":"2026-02-02T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.144130 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.195817 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.224427 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-k
ube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.243497 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.243543 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.243551 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.243569 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.243578 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:32Z","lastTransitionTime":"2026-02-02T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.264250 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.306215 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364a12dc7d10ca52af13d40bb44682bd7d6d8458c365e3e2efc89bdbbfb13667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.343874 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.346883 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.346962 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.346987 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.347017 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.347043 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:32Z","lastTransitionTime":"2026-02-02T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.375156 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 07:47:46.445797728 +0000 UTC Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.382465 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}
\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.408746 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:32 crc kubenswrapper[4721]: E0202 13:01:32.408950 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.423769 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.450882 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.450942 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.450954 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.450980 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.451006 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:32Z","lastTransitionTime":"2026-02-02T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.468222 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.511373 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.544750 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.554149 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.554225 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.554236 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.554254 4721 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.554266 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:32Z","lastTransitionTime":"2026-02-02T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.590401 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe8
8e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.623701 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.642209 4721 generic.go:334] "Generic (PLEG): container finished" podID="04b1629d-0184-4975-8d4b-7a32913e7389" containerID="2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868" exitCode=0 Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.642306 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" event={"ID":"04b1629d-0184-4975-8d4b-7a32913e7389","Type":"ContainerDied","Data":"2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868"} Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.642369 4721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.642949 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.656196 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.656252 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.656266 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.656289 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.656369 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:32Z","lastTransitionTime":"2026-02-02T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.659451 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.686693 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.701839 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.742269 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.758976 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.759030 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.759040 4721 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.759060 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.759094 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:32Z","lastTransitionTime":"2026-02-02T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.784490 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.824480 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.862793 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.862845 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.862855 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.862875 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.862886 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:32Z","lastTransitionTime":"2026-02-02T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.868253 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.901198 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.939753 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.965427 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.965461 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.965469 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.965484 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.965493 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:32Z","lastTransitionTime":"2026-02-02T13:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:32 crc kubenswrapper[4721]: I0202 13:01:32.982835 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9
f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints 
registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:32Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.023441 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.063043 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.068175 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.068234 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.068245 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.068260 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.068270 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:33Z","lastTransitionTime":"2026-02-02T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.101574 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.144606 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.170714 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.170760 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.170772 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.170792 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.170805 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:33Z","lastTransitionTime":"2026-02-02T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.183543 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.221811 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.270639 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364a12dc7d10ca52af13d40bb44682bd7d6d8458c365e3e2efc89bdbbfb13667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.273681 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.273731 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.273743 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:33 crc 
kubenswrapper[4721]: I0202 13:01:33.273765 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.273778 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:33Z","lastTransitionTime":"2026-02-02T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.303640 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.347597 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.375360 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 01:17:17.151726756 +0000 UTC Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.376399 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.376433 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.376447 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.376471 4721 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.376488 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:33Z","lastTransitionTime":"2026-02-02T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.383964 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.409277 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.409352 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:33 crc kubenswrapper[4721]: E0202 13:01:33.409462 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:01:33 crc kubenswrapper[4721]: E0202 13:01:33.409529 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.420949 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.461689 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.478821 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.478872 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.478881 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.478900 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.478912 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:33Z","lastTransitionTime":"2026-02-02T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.502772 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.542764 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.581929 4721 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.581982 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.581997 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.582017 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.582030 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:33Z","lastTransitionTime":"2026-02-02T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.586172 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364a12dc7d10ca52af13d40bb44682bd7d6d8458
c365e3e2efc89bdbbfb13667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.623589 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.648715 4721 generic.go:334] "Generic (PLEG): container finished" podID="04b1629d-0184-4975-8d4b-7a32913e7389" containerID="5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d" exitCode=0 Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.648790 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" event={"ID":"04b1629d-0184-4975-8d4b-7a32913e7389","Type":"ContainerDied","Data":"5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d"} Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.648864 4721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.665973 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.689316 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.689829 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.689839 4721 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.689858 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.689872 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:33Z","lastTransitionTime":"2026-02-02T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.701360 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.740998 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.792029 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.792085 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.792094 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.792107 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.792116 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:33Z","lastTransitionTime":"2026-02-02T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.796898 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.825968 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.867413 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.894561 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.894592 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.894601 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.894614 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.894623 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:33Z","lastTransitionTime":"2026-02-02T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.903311 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.940690 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.985646 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364a12dc7d10ca52af13d40bb44682bd7d6d8458c365e3e2efc89bdbbfb13667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:33Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.997573 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.997769 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.997864 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:33 crc 
kubenswrapper[4721]: I0202 13:01:33.997948 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:33 crc kubenswrapper[4721]: I0202 13:01:33.998037 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:33Z","lastTransitionTime":"2026-02-02T13:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.023431 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.086451 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.099003 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.100368 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.100481 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.100549 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.100617 4721 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.100672 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:34Z","lastTransitionTime":"2026-02-02T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.138793 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.188390 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.202980 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.203030 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.203041 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.203058 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.203085 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:34Z","lastTransitionTime":"2026-02-02T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.220655 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.258873 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.299343 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.304685 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.304714 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.304722 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.304736 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.304746 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:34Z","lastTransitionTime":"2026-02-02T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.344459 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.376379 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 04:23:20.408202988 +0000 UTC Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.381786 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.406718 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.406772 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.406788 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.406809 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.406826 4721 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:34Z","lastTransitionTime":"2026-02-02T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.409057 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:34 crc kubenswrapper[4721]: E0202 13:01:34.409177 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.426791 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.465328 4721 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.509037 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.509083 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.509094 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.509110 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.509121 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:34Z","lastTransitionTime":"2026-02-02T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.611098 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.611156 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.611170 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.611190 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.611204 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:34Z","lastTransitionTime":"2026-02-02T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.657142 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" event={"ID":"04b1629d-0184-4975-8d4b-7a32913e7389","Type":"ContainerStarted","Data":"54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15"} Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.660899 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovnkube-controller/0.log" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.674603 4721 generic.go:334] "Generic (PLEG): container finished" podID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerID="364a12dc7d10ca52af13d40bb44682bd7d6d8458c365e3e2efc89bdbbfb13667" exitCode=1 Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.674708 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerDied","Data":"364a12dc7d10ca52af13d40bb44682bd7d6d8458c365e3e2efc89bdbbfb13667"} Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.675736 4721 scope.go:117] "RemoveContainer" containerID="364a12dc7d10ca52af13d40bb44682bd7d6d8458c365e3e2efc89bdbbfb13667" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.677878 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.692753 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.706706 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.714709 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.714741 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.714750 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.714762 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.714772 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:34Z","lastTransitionTime":"2026-02-02T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.721161 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.735674 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.746780 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.794229 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.817704 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.817754 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.817768 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.817790 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.817804 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:34Z","lastTransitionTime":"2026-02-02T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.834152 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.852951 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.868312 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.901705 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.919494 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.919523 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.919532 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.919546 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.919558 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:34Z","lastTransitionTime":"2026-02-02T13:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.946157 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.982905 4721 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fb
b85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:34Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:34 crc kubenswrapper[4721]: I0202 13:01:34.984259 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:01:34 crc kubenswrapper[4721]: E0202 13:01:34.984461 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:01:50.984433639 +0000 UTC m=+51.286948028 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.021978 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.022049 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.022118 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.022159 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.022203 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:35Z","lastTransitionTime":"2026-02-02T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.022996 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.065677 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364a12dc7d10ca52af13d40bb44682bd7d6d8458c365e3e2efc89bdbbfb13667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.085453 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.085507 4721 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.085533 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.085558 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.085635 4721 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.085676 4721 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.085690 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:51.085672445 +0000 UTC m=+51.388186834 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.085635 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.085733 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.085753 4721 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.085762 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-02 13:01:51.085750667 +0000 UTC m=+51.388265056 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.085764 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.085832 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.085845 4721 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.085791 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:51.085778638 +0000 UTC m=+51.388293027 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.085928 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 13:01:51.085910581 +0000 UTC m=+51.388424970 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.121524 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.125703 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.125742 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.125756 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.125784 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.125794 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:35Z","lastTransitionTime":"2026-02-02T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.142630 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.192177 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.226349 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.227778 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.227835 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.227844 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.227859 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.227868 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:35Z","lastTransitionTime":"2026-02-02T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.261524 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.303151 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.330085 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.330125 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.330133 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.330147 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.330155 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:35Z","lastTransitionTime":"2026-02-02T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.344009 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06
\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.376776 4721 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 02:32:49.418177546 +0000 UTC Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.382954 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.408846 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.408846 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.409040 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.409110 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.432252 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.432292 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.432303 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.432318 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.432332 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:35Z","lastTransitionTime":"2026-02-02T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.432745 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://364a12dc7d10ca52af13d40bb44682bd7d6d8458
c365e3e2efc89bdbbfb13667\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364a12dc7d10ca52af13d40bb44682bd7d6d8458c365e3e2efc89bdbbfb13667\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"message\\\":\\\"etes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 13:01:34.589960 5998 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 13:01:34.590331 5998 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:34.590798 5998 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 13:01:34.590855 5998 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0202 13:01:34.590894 5998 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0202 13:01:34.590910 5998 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 13:01:34.590937 5998 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0202 13:01:34.590957 5998 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 13:01:34.590979 5998 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 13:01:34.591014 5998 factory.go:656] Stopping watch factory\\\\nI0202 13:01:34.591036 5998 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0202 13:01:34.591054 5998 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 13:01:34.591089 5998 ovnkube.go:599] Stopped ovnkube\\\\nI0202 
13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.464313 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4b
a8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.502301 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.534738 4721 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.534804 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.534823 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.534847 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.534864 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:35Z","lastTransitionTime":"2026-02-02T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.538362 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.583920 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.623539 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.636869 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.636907 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.636917 4721 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.636933 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.636946 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:35Z","lastTransitionTime":"2026-02-02T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.662116 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.679846 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovnkube-controller/1.log" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.680583 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovnkube-controller/0.log" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.683234 4721 generic.go:334] 
"Generic (PLEG): container finished" podID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerID="128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8" exitCode=1 Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.683272 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerDied","Data":"128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8"} Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.683317 4721 scope.go:117] "RemoveContainer" containerID="364a12dc7d10ca52af13d40bb44682bd7d6d8458c365e3e2efc89bdbbfb13667" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.683955 4721 scope.go:117] "RemoveContainer" containerID="128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8" Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.684133 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.702229 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.740229 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.740283 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.740300 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.740327 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
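Every status patch in the entries above and below fails with the same TLS error: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2026-02-02T13:01:35Z. What follows is a minimal sketch, not the kubelet's actual code path, of the x509 validity-window check that produces this message, written against Go's standard library. The PEM path /tmp/webhook-cert.pem is a hypothetical placeholder, not a path taken from this log; on a real node the serving certificate would first have to be captured separately (for example with `openssl s_client -connect 127.0.0.1:9743 -showcerts`).

// check_cert_expiry.go: sketch of the NotBefore/NotAfter check behind the
// "certificate has expired or is not yet valid" failures logged here.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	// Placeholder path; substitute wherever the webhook cert was saved.
	data, err := os.ReadFile("/tmp/webhook-cert.pem")
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		log.Fatal("no PEM block found in input")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	now := time.Now().UTC()
	// crypto/x509 rejects a chain whose leaf is outside [NotBefore, NotAfter];
	// the kubelet's webhook client surfaces that as the error seen above.
	switch {
	case now.After(cert.NotAfter):
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.UTC().Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}

Run against the expired webhook cert, this would print an "expired: current time ... is after 2025-08-24T17:21:41Z" line matching the failure repeated throughout these entries.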
Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.740344 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:35Z","lastTransitionTime":"2026-02-02T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.743002 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",
\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.800297 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f6
2044941596a47d3b7a03eea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://364a12dc7d10ca52af13d40bb44682bd7d6d8458c365e3e2efc89bdbbfb13667\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"message\\\":\\\"etes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 13:01:34.589960 5998 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 13:01:34.590331 5998 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:34.590798 5998 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 13:01:34.590855 5998 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0202 13:01:34.590894 5998 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0202 13:01:34.590910 5998 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 13:01:34.590937 5998 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0202 13:01:34.590957 5998 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 13:01:34.590979 5998 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 13:01:34.591014 5998 factory.go:656] Stopping watch factory\\\\nI0202 13:01:34.591036 5998 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0202 13:01:34.591054 5998 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 13:01:34.591089 5998 ovnkube.go:599] Stopped ovnkube\\\\nI0202 13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"ace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.608892 6180 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.609124 6180 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 13:01:35.609151 6180 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 13:01:35.609186 6180 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 13:01:35.609190 6180 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 13:01:35.609206 6180 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 13:01:35.609217 6180 factory.go:656] Stopping watch factory\\\\nI0202 13:01:35.609232 6180 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 13:01:35.609240 6180 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 13:01:35.609246 6180 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 13:01:35.609251 6180 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 13:01:35.609263 6180 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 13:01:35.609269 6180 
handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 13:01:35.608240 6180 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\
\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.822705 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.844003 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.844062 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.844113 4721 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.844138 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.844156 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:35Z","lastTransitionTime":"2026-02-02T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.866971 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.905801 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.944130 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.946825 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.946855 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.946866 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.946882 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.946902 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:35Z","lastTransitionTime":"2026-02-02T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.952483 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.952516 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.952529 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.952543 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.952558 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:35Z","lastTransitionTime":"2026-02-02T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.968120 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.972958 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.973021 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.973038 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.973061 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.973115 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:35Z","lastTransitionTime":"2026-02-02T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.986116 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: E0202 13:01:35.988749 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:35Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.993094 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.993138 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.993323 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.993352 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:35 crc kubenswrapper[4721]: I0202 13:01:35.993364 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:35Z","lastTransitionTime":"2026-02-02T13:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:36 crc kubenswrapper[4721]: E0202 13:01:36.016830 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.021549 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.021610 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.021623 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.021643 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.021656 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:36Z","lastTransitionTime":"2026-02-02T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.023493 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc 
kubenswrapper[4721]: E0202 13:01:36.037370 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider 
started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d
34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.041945 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.042007 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:36 
crc kubenswrapper[4721]: I0202 13:01:36.042021 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.042040 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.042054 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:36Z","lastTransitionTime":"2026-02-02T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:36 crc kubenswrapper[4721]: E0202 13:01:36.058047 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: E0202 13:01:36.058269 4721 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.060425 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.060464 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.060476 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.060494 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.060509 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:36Z","lastTransitionTime":"2026-02-02T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.070630 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.103375 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.142365 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.162692 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.162747 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.162758 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.162779 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.162793 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:36Z","lastTransitionTime":"2026-02-02T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.182224 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.225573 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.265890 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.265961 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.265986 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.266017 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.266038 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:36Z","lastTransitionTime":"2026-02-02T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.270324 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:
01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.368502 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.368553 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.368575 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.368599 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.368620 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:36Z","lastTransitionTime":"2026-02-02T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.380161 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 00:17:43.75048225 +0000 UTC Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.424712 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:36 crc kubenswrapper[4721]: E0202 13:01:36.424858 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.471158 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.471221 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.471232 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.471245 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.471255 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:36Z","lastTransitionTime":"2026-02-02T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.575214 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.575326 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.575349 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.575372 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.575389 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:36Z","lastTransitionTime":"2026-02-02T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.678954 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.679039 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.679057 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.679125 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.679145 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:36Z","lastTransitionTime":"2026-02-02T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.690208 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovnkube-controller/1.log" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.695346 4721 scope.go:117] "RemoveContainer" containerID="128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8" Feb 02 13:01:36 crc kubenswrapper[4721]: E0202 13:01:36.695505 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.721924 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f6
2044941596a47d3b7a03eea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"ace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.608892 6180 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.609124 6180 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 13:01:35.609151 6180 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 13:01:35.609186 6180 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 13:01:35.609190 6180 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 13:01:35.609206 6180 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 13:01:35.609217 6180 factory.go:656] Stopping watch factory\\\\nI0202 13:01:35.609232 6180 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 13:01:35.609240 6180 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 13:01:35.609246 6180 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 13:01:35.609251 6180 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 13:01:35.609263 6180 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 13:01:35.609269 6180 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 13:01:35.608240 6180 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.739910 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.755254 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.771492 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.781698 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.781763 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.781780 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.781803 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.781818 4721 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:36Z","lastTransitionTime":"2026-02-02T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.791887 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.810931 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.824624 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.842753 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.861014 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.875681 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.885016 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.885059 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.885085 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.885107 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.885119 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:36Z","lastTransitionTime":"2026-02-02T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.894455 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.911725 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.931149 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.947585 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d746
2\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.962766 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:36Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.987925 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.988011 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.988037 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.988100 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:36 crc kubenswrapper[4721]: I0202 13:01:36.988124 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:36Z","lastTransitionTime":"2026-02-02T13:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.091765 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.091808 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.091835 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.091851 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.091860 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:37Z","lastTransitionTime":"2026-02-02T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.195247 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.195315 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.195340 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.195372 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.195397 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:37Z","lastTransitionTime":"2026-02-02T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.298352 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.298436 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.298468 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.298501 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.298522 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:37Z","lastTransitionTime":"2026-02-02T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.380853 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 15:32:28.73693421 +0000 UTC
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.401023 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.401130 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.401150 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.401176 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.401201 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:37Z","lastTransitionTime":"2026-02-02T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.409480 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.409480 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 13:01:37 crc kubenswrapper[4721]: E0202 13:01:37.409722 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 13:01:37 crc kubenswrapper[4721]: E0202 13:01:37.409799 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.504224 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.504267 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.504280 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.504297 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.504310 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:37Z","lastTransitionTime":"2026-02-02T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.607391 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.607453 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.607472 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.607495 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.607564 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:37Z","lastTransitionTime":"2026-02-02T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.709773 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.709816 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.709826 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.709894 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.709908 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:37Z","lastTransitionTime":"2026-02-02T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.813197 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.813251 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.813267 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.813289 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.813329 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:37Z","lastTransitionTime":"2026-02-02T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.917593 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.917629 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.917639 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.917656 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:37 crc kubenswrapper[4721]: I0202 13:01:37.917667 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:37Z","lastTransitionTime":"2026-02-02T13:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.020372 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.020421 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.020430 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.020444 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.020454 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:38Z","lastTransitionTime":"2026-02-02T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.122843 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.122883 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.122891 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.122908 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.122917 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:38Z","lastTransitionTime":"2026-02-02T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.226248 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.226299 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.226312 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.226332 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.226346 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:38Z","lastTransitionTime":"2026-02-02T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.330834 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.330889 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.330903 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.330922 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.330933 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:38Z","lastTransitionTime":"2026-02-02T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.381399 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 20:41:17.533488669 +0000 UTC
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.409774 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:01:38 crc kubenswrapper[4721]: E0202 13:01:38.410014 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.433183 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.433226 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.433238 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.433255 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.433267 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:38Z","lastTransitionTime":"2026-02-02T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.535446 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.535484 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.535504 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.535519 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.535529 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:38Z","lastTransitionTime":"2026-02-02T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.638165 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.638218 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.638233 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.638253 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.638267 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:38Z","lastTransitionTime":"2026-02-02T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.741649 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.741706 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.741719 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.741737 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.741749 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:38Z","lastTransitionTime":"2026-02-02T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.845225 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.845268 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.845276 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.845292 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.845301 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:38Z","lastTransitionTime":"2026-02-02T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.948723 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.948777 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.948788 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.948808 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:38 crc kubenswrapper[4721]: I0202 13:01:38.948820 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:38Z","lastTransitionTime":"2026-02-02T13:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.009121 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc"]
Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.009738 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.011612 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.013718 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.035597 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a
67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.051717 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js5c8\" (UniqueName: \"kubernetes.io/projected/ecfc8a9e-993f-494a-ba91-4132345cee05-kube-api-access-js5c8\") pod \"ovnkube-control-plane-749d76644c-sz7tc\" (UID: \"ecfc8a9e-993f-494a-ba91-4132345cee05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.051811 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ecfc8a9e-993f-494a-ba91-4132345cee05-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sz7tc\" (UID: \"ecfc8a9e-993f-494a-ba91-4132345cee05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.051879 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ecfc8a9e-993f-494a-ba91-4132345cee05-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sz7tc\" (UID: \"ecfc8a9e-993f-494a-ba91-4132345cee05\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.051909 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ecfc8a9e-993f-494a-ba91-4132345cee05-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sz7tc\" (UID: \"ecfc8a9e-993f-494a-ba91-4132345cee05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.053759 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.053832 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.053846 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.053866 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.053878 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:39Z","lastTransitionTime":"2026-02-02T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.056280 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.071896 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.086910 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":
\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.107399 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.125486 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.144111 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.152464 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js5c8\" (UniqueName: \"kubernetes.io/projected/ecfc8a9e-993f-494a-ba91-4132345cee05-kube-api-access-js5c8\") pod \"ovnkube-control-plane-749d76644c-sz7tc\" (UID: \"ecfc8a9e-993f-494a-ba91-4132345cee05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.152496 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ecfc8a9e-993f-494a-ba91-4132345cee05-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sz7tc\" (UID: \"ecfc8a9e-993f-494a-ba91-4132345cee05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.152528 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ecfc8a9e-993f-494a-ba91-4132345cee05-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sz7tc\" (UID: \"ecfc8a9e-993f-494a-ba91-4132345cee05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.152547 4721 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ecfc8a9e-993f-494a-ba91-4132345cee05-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sz7tc\" (UID: \"ecfc8a9e-993f-494a-ba91-4132345cee05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.153207 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ecfc8a9e-993f-494a-ba91-4132345cee05-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sz7tc\" (UID: \"ecfc8a9e-993f-494a-ba91-4132345cee05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.153610 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ecfc8a9e-993f-494a-ba91-4132345cee05-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sz7tc\" (UID: \"ecfc8a9e-993f-494a-ba91-4132345cee05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.156207 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.156236 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.156246 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.156262 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.156272 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:39Z","lastTransitionTime":"2026-02-02T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.165377 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.168918 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ecfc8a9e-993f-494a-ba91-4132345cee05-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sz7tc\" (UID: \"ecfc8a9e-993f-494a-ba91-4132345cee05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.176521 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js5c8\" (UniqueName: \"kubernetes.io/projected/ecfc8a9e-993f-494a-ba91-4132345cee05-kube-api-access-js5c8\") pod \"ovnkube-control-plane-749d76644c-sz7tc\" (UID: \"ecfc8a9e-993f-494a-ba91-4132345cee05\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.184154 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887
ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.196545 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.215990 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f6
2044941596a47d3b7a03eea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"ace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.608892 6180 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.609124 6180 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 13:01:35.609151 6180 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 13:01:35.609186 6180 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 13:01:35.609190 6180 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 13:01:35.609206 6180 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 13:01:35.609217 6180 factory.go:656] Stopping watch factory\\\\nI0202 13:01:35.609232 6180 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 13:01:35.609240 6180 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 13:01:35.609246 6180 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 13:01:35.609251 6180 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 13:01:35.609263 6180 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 13:01:35.609269 6180 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 13:01:35.608240 6180 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.228692 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.241203 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.255180 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.259489 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.259526 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.259542 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.259563 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.259577 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:39Z","lastTransitionTime":"2026-02-02T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.268334 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.279445 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:39Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.324826 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" Feb 02 13:01:39 crc kubenswrapper[4721]: W0202 13:01:39.339463 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecfc8a9e_993f_494a_ba91_4132345cee05.slice/crio-64d78dd426cead0bb3cc56e04bb023ff7c0c3ad87f5f83a45c47481343606c56 WatchSource:0}: Error finding container 64d78dd426cead0bb3cc56e04bb023ff7c0c3ad87f5f83a45c47481343606c56: Status 404 returned error can't find the container with id 64d78dd426cead0bb3cc56e04bb023ff7c0c3ad87f5f83a45c47481343606c56 Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.362506 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.362548 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.362561 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.362580 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.362593 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:39Z","lastTransitionTime":"2026-02-02T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.382457 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 00:43:47.471245013 +0000 UTC Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.409120 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.409153 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:39 crc kubenswrapper[4721]: E0202 13:01:39.409277 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:01:39 crc kubenswrapper[4721]: E0202 13:01:39.409841 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.465111 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.465156 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.465167 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.465187 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.465199 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:39Z","lastTransitionTime":"2026-02-02T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.568732 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.568791 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.568810 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.568834 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.568852 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:39Z","lastTransitionTime":"2026-02-02T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.671894 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.671967 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.671990 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.672021 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.672043 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:39Z","lastTransitionTime":"2026-02-02T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.709533 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" event={"ID":"ecfc8a9e-993f-494a-ba91-4132345cee05","Type":"ContainerStarted","Data":"02e9bc6a3dd0746c09c22e858873a85ac0fd67fe903dfa6f2cd5061020ce5230"} Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.709581 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" event={"ID":"ecfc8a9e-993f-494a-ba91-4132345cee05","Type":"ContainerStarted","Data":"64d78dd426cead0bb3cc56e04bb023ff7c0c3ad87f5f83a45c47481343606c56"} Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.774315 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.774364 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.774378 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.774400 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.774415 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:39Z","lastTransitionTime":"2026-02-02T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.877366 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.877410 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.877424 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.877440 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.877452 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:39Z","lastTransitionTime":"2026-02-02T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.980201 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.980258 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.980270 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.980296 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:39 crc kubenswrapper[4721]: I0202 13:01:39.980309 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:39Z","lastTransitionTime":"2026-02-02T13:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.083044 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.083122 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.083142 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.083169 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.083186 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:40Z","lastTransitionTime":"2026-02-02T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.105557 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-xqz79"] Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.106200 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:40 crc kubenswrapper[4721]: E0202 13:01:40.106280 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.124497 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.142442 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.158380 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.162217 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbwws\" (UniqueName: \"kubernetes.io/projected/bfab3ffb-8798-423d-9b55-83868b76a14e-kube-api-access-vbwws\") pod \"network-metrics-daemon-xqz79\" (UID: \"bfab3ffb-8798-423d-9b55-83868b76a14e\") " pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.162286 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs\") pod \"network-metrics-daemon-xqz79\" (UID: \"bfab3ffb-8798-423d-9b55-83868b76a14e\") " pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.175989 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.186826 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.186859 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.186868 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.186882 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.186892 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:40Z","lastTransitionTime":"2026-02-02T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.196241 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:
01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.213868 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.230456 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.252267 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"ace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.608892 6180 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.609124 6180 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 13:01:35.609151 6180 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 13:01:35.609186 6180 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 13:01:35.609190 6180 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 13:01:35.609206 6180 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 13:01:35.609217 6180 factory.go:656] Stopping watch factory\\\\nI0202 13:01:35.609232 6180 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 13:01:35.609240 6180 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 13:01:35.609246 6180 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 13:01:35.609251 6180 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 13:01:35.609263 6180 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 13:01:35.609269 6180 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 13:01:35.608240 6180 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.263728 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbwws\" (UniqueName: \"kubernetes.io/projected/bfab3ffb-8798-423d-9b55-83868b76a14e-kube-api-access-vbwws\") pod \"network-metrics-daemon-xqz79\" (UID: \"bfab3ffb-8798-423d-9b55-83868b76a14e\") " pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.263774 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs\") pod \"network-metrics-daemon-xqz79\" (UID: \"bfab3ffb-8798-423d-9b55-83868b76a14e\") " pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:40 crc kubenswrapper[4721]: E0202 13:01:40.263895 4721 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 13:01:40 crc kubenswrapper[4721]: E0202 13:01:40.263944 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs podName:bfab3ffb-8798-423d-9b55-83868b76a14e nodeName:}" failed. No retries permitted until 2026-02-02 13:01:40.763929582 +0000 UTC m=+41.066443981 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs") pod "network-metrics-daemon-xqz79" (UID: "bfab3ffb-8798-423d-9b55-83868b76a14e") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.266508 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.281607 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbwws\" (UniqueName: \"kubernetes.io/projected/bfab3ffb-8798-423d-9b55-83868b76a14e-kube-api-access-vbwws\") pod \"network-metrics-daemon-xqz79\" (UID: \"bfab3ffb-8798-423d-9b55-83868b76a14e\") " pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.281990 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.289165 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.289205 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.289214 4721 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.289230 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.289241 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:40Z","lastTransitionTime":"2026-02-02T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.298896 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.311251 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.324974 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.355842 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.378405 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.382686 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 03:51:52.158317144 +0000 UTC Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.391768 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.391801 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.391809 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.391823 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.391832 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:40Z","lastTransitionTime":"2026-02-02T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.398112 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.409364 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:40 crc kubenswrapper[4721]: E0202 13:01:40.409501 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.416199 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.430833 4721 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.443562 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.454551 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.465539 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.477808 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.491783 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.493756 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.493803 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.493820 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.493843 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.493861 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:40Z","lastTransitionTime":"2026-02-02T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.505636 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.518091 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.537963 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.557690 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf
1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.577056 4721 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.591706 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.595841 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.595878 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.595887 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.595903 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.595914 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:40Z","lastTransitionTime":"2026-02-02T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.607942 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.628121 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.643763 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.659408 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.680301 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"ace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.608892 6180 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.609124 6180 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 13:01:35.609151 6180 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 13:01:35.609186 6180 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 13:01:35.609190 6180 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 13:01:35.609206 6180 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 13:01:35.609217 6180 factory.go:656] Stopping watch factory\\\\nI0202 13:01:35.609232 6180 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 13:01:35.609240 6180 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 13:01:35.609246 6180 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 13:01:35.609251 6180 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 13:01:35.609263 6180 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 13:01:35.609269 6180 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 13:01:35.608240 6180 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.699135 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.699201 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.699212 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.699232 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.699246 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:40Z","lastTransitionTime":"2026-02-02T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.714851 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" event={"ID":"ecfc8a9e-993f-494a-ba91-4132345cee05","Type":"ContainerStarted","Data":"b0c7004e0fbbf58f24a8539f9e323fdba3cbe648aff6b6c7be443d765260a148"} Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.735376 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.749730 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.759487 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.770855 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.772390 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs\") pod \"network-metrics-daemon-xqz79\" (UID: \"bfab3ffb-8798-423d-9b55-83868b76a14e\") " pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:40 crc kubenswrapper[4721]: E0202 13:01:40.772585 4721 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 13:01:40 crc kubenswrapper[4721]: E0202 13:01:40.772654 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs podName:bfab3ffb-8798-423d-9b55-83868b76a14e nodeName:}" failed. No retries permitted until 2026-02-02 13:01:41.772636461 +0000 UTC m=+42.075150860 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs") pod "network-metrics-daemon-xqz79" (UID: "bfab3ffb-8798-423d-9b55-83868b76a14e") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.791088 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-contr
oller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.801636 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.801721 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.801765 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.802466 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.802517 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:40Z","lastTransitionTime":"2026-02-02T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.810935 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.830597 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.847526 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.869724 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.892513 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.906228 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.906300 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.906318 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.906345 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.906362 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:40Z","lastTransitionTime":"2026-02-02T13:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.907517 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.928055 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"ace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.608892 6180 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.609124 6180 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 13:01:35.609151 6180 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 13:01:35.609186 6180 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 13:01:35.609190 6180 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 13:01:35.609206 6180 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 13:01:35.609217 6180 factory.go:656] Stopping watch factory\\\\nI0202 13:01:35.609232 6180 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 13:01:35.609240 6180 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 13:01:35.609246 6180 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 13:01:35.609251 6180 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 13:01:35.609263 6180 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 13:01:35.609269 6180 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 13:01:35.608240 6180 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.940812 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.953534 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.965453 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.977575 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:40 crc kubenswrapper[4721]: I0202 13:01:40.990210 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e9bc6a3dd0746c09c22e858873a85ac0fd67fe903dfa6f2cd5061020ce5230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c7004e0fbbf58f24a8539f9e323fdba3cbe648aff6b6c7be443d765260a148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:40Z is after 2025-08-24T17:21:41Z" Feb 02 
13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.009332 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.009400 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.009414 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.009433 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.009445 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:41Z","lastTransitionTime":"2026-02-02T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.113211 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.113287 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.113307 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.113341 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.113361 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:41Z","lastTransitionTime":"2026-02-02T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.216775 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.216847 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.216865 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.216894 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.216914 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:41Z","lastTransitionTime":"2026-02-02T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.319909 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.319959 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.319971 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.319989 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.320002 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:41Z","lastTransitionTime":"2026-02-02T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.383160 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 08:00:10.033090849 +0000 UTC Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.409612 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.409680 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:41 crc kubenswrapper[4721]: E0202 13:01:41.409754 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.409613 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:41 crc kubenswrapper[4721]: E0202 13:01:41.409889 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:01:41 crc kubenswrapper[4721]: E0202 13:01:41.410119 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.423234 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.423278 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.423294 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.423317 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.423335 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:41Z","lastTransitionTime":"2026-02-02T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.526907 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.526968 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.526989 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.527019 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.527043 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:41Z","lastTransitionTime":"2026-02-02T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.630250 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.630325 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.630340 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.630362 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.630379 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:41Z","lastTransitionTime":"2026-02-02T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.752369 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.752496 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.752518 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.752542 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.752559 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:41Z","lastTransitionTime":"2026-02-02T13:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:41 crc kubenswrapper[4721]: I0202 13:01:41.785042 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs\") pod \"network-metrics-daemon-xqz79\" (UID: \"bfab3ffb-8798-423d-9b55-83868b76a14e\") " pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:41 crc kubenswrapper[4721]: E0202 13:01:41.785425 4721 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 13:01:41 crc kubenswrapper[4721]: E0202 13:01:41.785530 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs podName:bfab3ffb-8798-423d-9b55-83868b76a14e nodeName:}" failed. No retries permitted until 2026-02-02 13:01:43.785507468 +0000 UTC m=+44.088021857 (durationBeforeRetry 2s). 
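The failed mount above is parked with durationBeforeRetry 2s (no retries until 13:01:43.785), and when the same volume fails again further down the delay grows to 4s. A sketch of that kind of doubling backoff, assuming a fixed initial delay and cap; this is illustrative, not the volume manager's actual implementation:

    package main

    import (
    	"fmt"
    	"time"
    )

    // nextBackoff doubles the retry delay up to a cap -- the pattern the
    // durationBeforeRetry values in this log suggest (2s, then 4s for the
    // same volume). Initial delay and cap are assumptions.
    func nextBackoff(current, max time.Duration) time.Duration {
    	if current <= 0 {
    		return 2 * time.Second // assumed initial delay, matching the first retry seen here
    	}
    	next := 2 * current
    	if next > max {
    		return max
    	}
    	return next
    }

    func main() {
    	d := time.Duration(0)
    	for i := 0; i < 5; i++ {
    		d = nextBackoff(d, 2*time.Minute)
    		fmt.Println(d) // 2s 4s 8s 16s 32s
    	}
    }

The backoff only caps the retry cadence; the mount keeps failing until the metrics-daemon-secret object is registered.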
Feb 02 13:01:42 crc kubenswrapper[4721]: I0202 13:01:42.061549 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:42 crc kubenswrapper[4721]: I0202 13:01:42.061620 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:42 crc kubenswrapper[4721]: I0202 13:01:42.061632 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:42 crc kubenswrapper[4721]: I0202 13:01:42.061678 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:42 crc kubenswrapper[4721]: I0202 13:01:42.061693 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:42Z","lastTransitionTime":"2026-02-02T13:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:42 crc kubenswrapper[4721]: I0202 13:01:42.383374 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 23:34:48.641660503 +0000 UTC
Feb 02 13:01:42 crc kubenswrapper[4721]: I0202 13:01:42.409391 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:01:42 crc kubenswrapper[4721]: E0202 13:01:42.409628 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
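Note how each certificate_manager line reports the same expiration (2026-02-24 05:53:03 UTC) but a different rotation deadline every second. That pattern is consistent with the deadline being re-drawn at random inside a window of the certificate's lifetime; a sketch under that assumption follows, where the one-year validity and the 70-90% window are assumptions, not values from this log:

    package main

    import (
    	"fmt"
    	"math/rand"
    	"time"
    )

    // rotationDeadline picks a random point late in the certificate's
    // validity, which would explain why each pass above prints a different
    // deadline for the same expiration time. The [0.7, 0.9) window is assumed.
    func rotationDeadline(notBefore, notAfter time.Time) time.Time {
    	total := notAfter.Sub(notBefore)
    	jitter := 0.7 + 0.2*rand.Float64()
    	return notBefore.Add(time.Duration(jitter * float64(total)))
    }

    func main() {
    	notAfter, _ := time.Parse("2006-01-02 15:04:05", "2026-02-24 05:53:03")
    	notBefore := notAfter.AddDate(0, -12, 0) // assumed one-year validity
    	for i := 0; i < 3; i++ {
    		fmt.Println(rotationDeadline(notBefore, notAfter))
    	}
    }

Randomizing the deadline spreads rotation load so a fleet of kubelets does not renew serving certificates at the same instant.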
Feb 02 13:01:43 crc kubenswrapper[4721]: I0202 13:01:43.098939 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:43 crc kubenswrapper[4721]: I0202 13:01:43.098975 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:43 crc kubenswrapper[4721]: I0202 13:01:43.098988 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:43 crc kubenswrapper[4721]: I0202 13:01:43.099004 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:43 crc kubenswrapper[4721]: I0202 13:01:43.099017 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:43Z","lastTransitionTime":"2026-02-02T13:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:43 crc kubenswrapper[4721]: I0202 13:01:43.383536 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 16:33:17.06348755 +0000 UTC
Feb 02 13:01:43 crc kubenswrapper[4721]: I0202 13:01:43.408626 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79"
Feb 02 13:01:43 crc kubenswrapper[4721]: E0202 13:01:43.408765 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e"
Feb 02 13:01:43 crc kubenswrapper[4721]: I0202 13:01:43.408956 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 13:01:43 crc kubenswrapper[4721]: I0202 13:01:43.408626 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 13:01:43 crc kubenswrapper[4721]: E0202 13:01:43.409259 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 13:01:43 crc kubenswrapper[4721]: E0202 13:01:43.409436 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 13:01:43 crc kubenswrapper[4721]: I0202 13:01:43.806568 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs\") pod \"network-metrics-daemon-xqz79\" (UID: \"bfab3ffb-8798-423d-9b55-83868b76a14e\") " pod="openshift-multus/network-metrics-daemon-xqz79"
Feb 02 13:01:43 crc kubenswrapper[4721]: E0202 13:01:43.806887 4721 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 02 13:01:43 crc kubenswrapper[4721]: E0202 13:01:43.806993 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs podName:bfab3ffb-8798-423d-9b55-83868b76a14e nodeName:}" failed. No retries permitted until 2026-02-02 13:01:47.806964847 +0000 UTC m=+48.109479276 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs") pod "network-metrics-daemon-xqz79" (UID: "bfab3ffb-8798-423d-9b55-83868b76a14e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.032793 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.032853 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.032868 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.032890 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.032909 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:44Z","lastTransitionTime":"2026-02-02T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.384639 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 14:12:16.247949756 +0000 UTC
Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.409056 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:01:44 crc kubenswrapper[4721]: E0202 13:01:44.409312 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.445195 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.445244 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.445253 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.445266 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.445278 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:44Z","lastTransitionTime":"2026-02-02T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.547550 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.547604 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.547621 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.547644 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.547660 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:44Z","lastTransitionTime":"2026-02-02T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.650192 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.650223 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.650231 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.650245 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.650256 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:44Z","lastTransitionTime":"2026-02-02T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.753579 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.753632 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.753650 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.753674 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.753692 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:44Z","lastTransitionTime":"2026-02-02T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.856799 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.856846 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.856855 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.856872 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.856883 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:44Z","lastTransitionTime":"2026-02-02T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.960193 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.960232 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.960242 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.960256 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:44 crc kubenswrapper[4721]: I0202 13:01:44.960265 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:44Z","lastTransitionTime":"2026-02-02T13:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.068754 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.068824 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.068836 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.068858 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.068872 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:45Z","lastTransitionTime":"2026-02-02T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.171668 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.171725 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.171735 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.171747 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.171757 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:45Z","lastTransitionTime":"2026-02-02T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.275507 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.275570 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.275606 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.275638 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.275661 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:45Z","lastTransitionTime":"2026-02-02T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.382809 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.382900 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.382924 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.382953 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.382976 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:45Z","lastTransitionTime":"2026-02-02T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.385576 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 08:16:31.892759916 +0000 UTC Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.409053 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.409165 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.409126 4721 util.go:30] "No sandbox for pod can be found. 
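The condition printed by the setters.go records above is a plain JSON object. A self-contained struct that round-trips that exact shape, with field names copied from the log output; this mirrors the printed payload, not the kubelet's internal types:

    package main

    import (
    	"encoding/json"
    	"fmt"
    )

    // nodeCondition mirrors the condition object printed by the
    // "Node became not ready" records above.
    type nodeCondition struct {
    	Type               string `json:"type"`
    	Status             string `json:"status"`
    	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
    	LastTransitionTime string `json:"lastTransitionTime"`
    	Reason             string `json:"reason"`
    	Message            string `json:"message"`
    }

    func main() {
    	c := nodeCondition{
    		Type:               "Ready",
    		Status:             "False",
    		LastHeartbeatTime:  "2026-02-02T13:01:45Z",
    		LastTransitionTime: "2026-02-02T13:01:45Z",
    		Reason:             "KubeletNotReady",
    		Message:            "container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?",
    	}
    	b, _ := json.Marshal(c)
    	fmt.Println(string(b))
    }

The same condition shape, escaped once more, appears inside the failed status patch at the end of this section.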
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:45 crc kubenswrapper[4721]: E0202 13:01:45.409261 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:01:45 crc kubenswrapper[4721]: E0202 13:01:45.409397 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:01:45 crc kubenswrapper[4721]: E0202 13:01:45.409661 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.486665 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.486729 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.486788 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.486821 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.486844 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:45Z","lastTransitionTime":"2026-02-02T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.590637 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.590706 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.590724 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.590750 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.590767 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:45Z","lastTransitionTime":"2026-02-02T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.695033 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.695178 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.695205 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.695236 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.695260 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:45Z","lastTransitionTime":"2026-02-02T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.798045 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.798149 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.798188 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.798225 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.798247 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:45Z","lastTransitionTime":"2026-02-02T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.901635 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.901687 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.901698 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.901723 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:45 crc kubenswrapper[4721]: I0202 13:01:45.901736 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:45Z","lastTransitionTime":"2026-02-02T13:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.004462 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.004517 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.004529 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.004550 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.004566 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:46Z","lastTransitionTime":"2026-02-02T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.108525 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.108590 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.108603 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.108628 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.108642 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:46Z","lastTransitionTime":"2026-02-02T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.124049 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.124123 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.124139 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.124164 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.124182 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:46Z","lastTransitionTime":"2026-02-02T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:46 crc kubenswrapper[4721]: E0202 13:01:46.148456 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:46Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.154173 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.154218 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.154233 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.154253 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.154266 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:46Z","lastTransitionTime":"2026-02-02T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:46 crc kubenswrapper[4721]: E0202 13:01:46.172953 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:46Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.177479 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.177524 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.177538 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.177565 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.177585 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:46Z","lastTransitionTime":"2026-02-02T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:46 crc kubenswrapper[4721]: E0202 13:01:46.197180 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:46Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.202194 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.202343 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.202419 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.202486 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.202544 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:46Z","lastTransitionTime":"2026-02-02T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:46 crc kubenswrapper[4721]: E0202 13:01:46.218500 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:46Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.223765 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.223811 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.223829 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.223851 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.223863 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:46Z","lastTransitionTime":"2026-02-02T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:46 crc kubenswrapper[4721]: E0202 13:01:46.240194 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:46Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:46 crc kubenswrapper[4721]: E0202 13:01:46.240360 4721 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.242431 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.242470 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.242482 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.242500 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.242512 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:46Z","lastTransitionTime":"2026-02-02T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.345779 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.345840 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.345850 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.345878 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.345891 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:46Z","lastTransitionTime":"2026-02-02T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.385992 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 16:19:41.552084555 +0000 UTC Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.409657 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:46 crc kubenswrapper[4721]: E0202 13:01:46.409792 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.449209 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.449264 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.449279 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.449298 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.449314 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:46Z","lastTransitionTime":"2026-02-02T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.552323 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.552358 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.552367 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.552379 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.552388 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:46Z","lastTransitionTime":"2026-02-02T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.655464 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.655525 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.655550 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.655579 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.655602 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:46Z","lastTransitionTime":"2026-02-02T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.759484 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.759538 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.759551 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.759572 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.759584 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:46Z","lastTransitionTime":"2026-02-02T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.862949 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.863061 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.863114 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.863144 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.863160 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:46Z","lastTransitionTime":"2026-02-02T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.966564 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.966843 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.966870 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.966902 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:46 crc kubenswrapper[4721]: I0202 13:01:46.966926 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:46Z","lastTransitionTime":"2026-02-02T13:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.069989 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.070041 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.070052 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.070084 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.070098 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:47Z","lastTransitionTime":"2026-02-02T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.173623 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.173696 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.173714 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.173742 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.173763 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:47Z","lastTransitionTime":"2026-02-02T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.277500 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.277546 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.277561 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.277583 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.277598 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:47Z","lastTransitionTime":"2026-02-02T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.381254 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.381316 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.381336 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.381360 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.381379 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:47Z","lastTransitionTime":"2026-02-02T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.387055 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 17:02:16.671589262 +0000 UTC Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.409470 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.409805 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:47 crc kubenswrapper[4721]: E0202 13:01:47.409890 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.409978 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:47 crc kubenswrapper[4721]: E0202 13:01:47.410280 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:01:47 crc kubenswrapper[4721]: E0202 13:01:47.410425 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.484580 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.484617 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.484630 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.484683 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.484699 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:47Z","lastTransitionTime":"2026-02-02T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.587695 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.587756 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.587768 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.587787 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.587800 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:47Z","lastTransitionTime":"2026-02-02T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.690738 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.691142 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.691235 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.691336 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.691402 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:47Z","lastTransitionTime":"2026-02-02T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.795611 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.795691 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.795705 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.795730 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.795745 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:47Z","lastTransitionTime":"2026-02-02T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.857818 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs\") pod \"network-metrics-daemon-xqz79\" (UID: \"bfab3ffb-8798-423d-9b55-83868b76a14e\") " pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:47 crc kubenswrapper[4721]: E0202 13:01:47.858127 4721 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 13:01:47 crc kubenswrapper[4721]: E0202 13:01:47.858264 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs podName:bfab3ffb-8798-423d-9b55-83868b76a14e nodeName:}" failed. No retries permitted until 2026-02-02 13:01:55.85823191 +0000 UTC m=+56.160746339 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs") pod "network-metrics-daemon-xqz79" (UID: "bfab3ffb-8798-423d-9b55-83868b76a14e") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.899457 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.899515 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.899537 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.899560 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:47 crc kubenswrapper[4721]: I0202 13:01:47.899572 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:47Z","lastTransitionTime":"2026-02-02T13:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.003810 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.003903 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.003916 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.003941 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.003958 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:48Z","lastTransitionTime":"2026-02-02T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.107514 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.107582 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.107605 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.107635 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.107657 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:48Z","lastTransitionTime":"2026-02-02T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.210921 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.210975 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.210988 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.211009 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.211029 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:48Z","lastTransitionTime":"2026-02-02T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.314412 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.314453 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.314463 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.314483 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.314497 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:48Z","lastTransitionTime":"2026-02-02T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.387697 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 06:06:55.448701676 +0000 UTC Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.409282 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:48 crc kubenswrapper[4721]: E0202 13:01:48.409994 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.416844 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.416896 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.416911 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.416930 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.416946 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:48Z","lastTransitionTime":"2026-02-02T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.519884 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.519941 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.519956 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.519977 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.519991 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:48Z","lastTransitionTime":"2026-02-02T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.622117 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.622187 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.622209 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.622234 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.622252 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:48Z","lastTransitionTime":"2026-02-02T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.725207 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.725245 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.725258 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.725279 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.725290 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:48Z","lastTransitionTime":"2026-02-02T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.829374 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.829445 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.829463 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.829483 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.829495 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:48Z","lastTransitionTime":"2026-02-02T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.932509 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.932572 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.932589 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.932690 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:48 crc kubenswrapper[4721]: I0202 13:01:48.932710 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:48Z","lastTransitionTime":"2026-02-02T13:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.036193 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.036257 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.036279 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.036308 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.036329 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:49Z","lastTransitionTime":"2026-02-02T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.139462 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.139516 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.139534 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.139559 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.139580 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:49Z","lastTransitionTime":"2026-02-02T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.242610 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.242652 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.242662 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.242679 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.242689 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:49Z","lastTransitionTime":"2026-02-02T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.346602 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.347112 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.347316 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.347481 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.347624 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:49Z","lastTransitionTime":"2026-02-02T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.388380 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 12:09:55.618039191 +0000 UTC Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.408769 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.408869 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.408915 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:49 crc kubenswrapper[4721]: E0202 13:01:49.409673 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:01:49 crc kubenswrapper[4721]: E0202 13:01:49.409820 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:01:49 crc kubenswrapper[4721]: E0202 13:01:49.409979 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.451306 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.451381 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.451420 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.451453 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.451473 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:49Z","lastTransitionTime":"2026-02-02T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.555351 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.555399 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.555408 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.555423 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.555432 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:49Z","lastTransitionTime":"2026-02-02T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.658660 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.659176 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.659435 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.659603 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.659757 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:49Z","lastTransitionTime":"2026-02-02T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.763289 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.763403 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.763470 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.763499 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.763521 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:49Z","lastTransitionTime":"2026-02-02T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.866559 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.866856 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.867019 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.867257 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.867392 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:49Z","lastTransitionTime":"2026-02-02T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.970479 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.970753 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.970993 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.971165 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:49 crc kubenswrapper[4721]: I0202 13:01:49.971385 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:49Z","lastTransitionTime":"2026-02-02T13:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.074410 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.074787 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.074918 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.075096 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.075250 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:50Z","lastTransitionTime":"2026-02-02T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.178363 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.178405 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.178418 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.178432 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.178441 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:50Z","lastTransitionTime":"2026-02-02T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.280636 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.281042 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.281324 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.281495 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.281631 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:50Z","lastTransitionTime":"2026-02-02T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.384835 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.384885 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.384893 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.384907 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.384917 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:50Z","lastTransitionTime":"2026-02-02T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.389131 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 16:38:32.19320661 +0000 UTC Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.409729 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:50 crc kubenswrapper[4721]: E0202 13:01:50.409937 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.432683 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.451986 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.471564 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.487452 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.487517 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.487541 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.487572 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.487595 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:50Z","lastTransitionTime":"2026-02-02T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.492679 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:
01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.517308 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-cr
c-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.538863 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.575433 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"ace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.608892 6180 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.609124 6180 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 13:01:35.609151 6180 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 13:01:35.609186 6180 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 13:01:35.609190 6180 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 13:01:35.609206 6180 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 13:01:35.609217 6180 factory.go:656] Stopping watch factory\\\\nI0202 13:01:35.609232 6180 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 13:01:35.609240 6180 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 13:01:35.609246 6180 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 13:01:35.609251 6180 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 13:01:35.609263 6180 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 13:01:35.609269 6180 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 13:01:35.608240 6180 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.591124 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.591173 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.591185 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.591201 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.591213 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:50Z","lastTransitionTime":"2026-02-02T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.596937 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.615871 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.628190 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.650793 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e9bc6a3dd0746c09c22e858873a85ac0fd67fe903dfa6f2cd5061020ce5230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c7004e0fbbf58f24a8539f9e323fdba3cbe648aff6b6c7be443d765260a148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 
13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.666085 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.681305 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.693696 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.693734 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.693745 4721 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.693761 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.693773 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:50Z","lastTransitionTime":"2026-02-02T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.696925 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.709829 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.730003 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.746445 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:50Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.795673 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.795773 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.795794 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.795819 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.795836 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:50Z","lastTransitionTime":"2026-02-02T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.899054 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.899140 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.899158 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.899182 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.899202 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:50Z","lastTransitionTime":"2026-02-02T13:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:50 crc kubenswrapper[4721]: I0202 13:01:50.996202 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:01:50 crc kubenswrapper[4721]: E0202 13:01:50.996609 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:02:22.996568758 +0000 UTC m=+83.299083187 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.002528 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.003138 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.003372 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.003574 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.003897 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:51Z","lastTransitionTime":"2026-02-02T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.097503 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.097946 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.098418 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.098635 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:51 crc kubenswrapper[4721]: E0202 13:01:51.097657 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 13:01:51 crc kubenswrapper[4721]: E0202 13:01:51.098943 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 13:01:51 crc kubenswrapper[4721]: E0202 13:01:51.099117 4721 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:51 crc kubenswrapper[4721]: E0202 13:01:51.098053 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 13:01:51 crc kubenswrapper[4721]: E0202 13:01:51.098534 4721 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 13:01:51 crc kubenswrapper[4721]: E0202 13:01:51.099487 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 13:01:51 crc kubenswrapper[4721]: E0202 13:01:51.099556 4721 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:51 crc kubenswrapper[4721]: E0202 13:01:51.098738 4721 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 13:01:51 crc kubenswrapper[4721]: E0202 13:01:51.099330 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 13:02:23.099312724 +0000 UTC m=+83.401827133 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:51 crc kubenswrapper[4721]: E0202 13:01:51.099790 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 13:02:23.099745435 +0000 UTC m=+83.402259864 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:01:51 crc kubenswrapper[4721]: E0202 13:01:51.099832 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 13:02:23.099814907 +0000 UTC m=+83.402329326 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 13:01:51 crc kubenswrapper[4721]: E0202 13:01:51.100322 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 13:02:23.100262469 +0000 UTC m=+83.402776898 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.108451 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.108509 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.108529 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.108558 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.108578 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:51Z","lastTransitionTime":"2026-02-02T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.211843 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.211898 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.211915 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.211938 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.211954 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:51Z","lastTransitionTime":"2026-02-02T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.314652 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.314687 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.314695 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.314712 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.314722 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:51Z","lastTransitionTime":"2026-02-02T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.389825 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 16:13:13.507359756 +0000 UTC
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.408843 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.409237 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.409268 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 13:01:51 crc kubenswrapper[4721]: E0202 13:01:51.409369 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e"
Feb 02 13:01:51 crc kubenswrapper[4721]: E0202 13:01:51.409568 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.409820 4721 scope.go:117] "RemoveContainer" containerID="128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8"
Feb 02 13:01:51 crc kubenswrapper[4721]: E0202 13:01:51.409829 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.417707 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.417764 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.417774 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.417794 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.417807 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:51Z","lastTransitionTime":"2026-02-02T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.520414 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.520889 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.520939 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.520958 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.520971 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:51Z","lastTransitionTime":"2026-02-02T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.623120 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.623165 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.623178 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.623193 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.623202 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:51Z","lastTransitionTime":"2026-02-02T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.725945 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.726007 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.726026 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.726050 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.726098 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:51Z","lastTransitionTime":"2026-02-02T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.790500 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovnkube-controller/1.log" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.793345 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerStarted","Data":"6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1"} Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.793505 4721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.809632 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406
f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:51Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.827680 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:51Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.829468 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.829515 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.829527 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.829548 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.829564 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:51Z","lastTransitionTime":"2026-02-02T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.848334 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:51Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.866085 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:51Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.888583 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T13:01:51Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.913030 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:51Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.927219 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:51Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.932442 4721 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.932487 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.932497 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.932518 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.932534 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:51Z","lastTransitionTime":"2026-02-02T13:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.954935 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5
e0c0dd17cff95375c2bc73f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"ace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.608892 6180 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.609124 6180 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 13:01:35.609151 6180 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 13:01:35.609186 6180 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 13:01:35.609190 6180 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 13:01:35.609206 6180 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 13:01:35.609217 6180 factory.go:656] Stopping watch factory\\\\nI0202 13:01:35.609232 6180 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 13:01:35.609240 6180 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 13:01:35.609246 6180 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 13:01:35.609251 6180 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 13:01:35.609263 6180 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 13:01:35.609269 6180 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 13:01:35.608240 6180 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:51Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.970593 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:51Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:51 crc kubenswrapper[4721]: I0202 13:01:51.986745 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:51Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.000813 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:51Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.011035 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:52Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.024122 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e9bc6a3dd0746c09c22e858873a85ac0fd67fe903dfa6f2cd5061020ce5230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c7004e0fbbf58f24a8539f9e323fdba3cbe648aff6b6c7be443d765260a148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:52Z is after 2025-08-24T17:21:41Z" Feb 02 
13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.035494 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.035545 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.035557 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.035578 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.035593 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:52Z","lastTransitionTime":"2026-02-02T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.044733 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:52Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.056944 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:52Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.068443 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:52Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.078947 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:52Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.138013 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.138056 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.138080 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.138103 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.138146 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:52Z","lastTransitionTime":"2026-02-02T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.241417 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.241479 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.241503 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.241526 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.241543 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:52Z","lastTransitionTime":"2026-02-02T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.343712 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.343747 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.343755 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.343771 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.343782 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:52Z","lastTransitionTime":"2026-02-02T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.390258 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 12:16:26.868556933 +0000 UTC Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.408775 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:52 crc kubenswrapper[4721]: E0202 13:01:52.408900 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.447243 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.447314 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.447330 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.447353 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.447372 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:52Z","lastTransitionTime":"2026-02-02T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.482674 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.550693 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.550727 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.550735 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.550748 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.550758 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:52Z","lastTransitionTime":"2026-02-02T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.653737 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.653797 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.653807 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.653827 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.653841 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:52Z","lastTransitionTime":"2026-02-02T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.756394 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.756468 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.756480 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.756503 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.756527 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:52Z","lastTransitionTime":"2026-02-02T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.799360 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovnkube-controller/2.log" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.800364 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovnkube-controller/1.log" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.803982 4721 generic.go:334] "Generic (PLEG): container finished" podID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerID="6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1" exitCode=1 Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.804039 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerDied","Data":"6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1"} Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.804114 4721 scope.go:117] "RemoveContainer" containerID="128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.805524 4721 scope.go:117] "RemoveContainer" containerID="6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1" Feb 02 13:01:52 crc kubenswrapper[4721]: E0202 13:01:52.805851 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.827331 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:52Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.845243 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e
6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:52Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.858608 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.858639 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.858651 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.858668 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.858684 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:52Z","lastTransitionTime":"2026-02-02T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.876466 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"ace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.608892 6180 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.609124 6180 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 13:01:35.609151 6180 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 13:01:35.609186 6180 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 13:01:35.609190 6180 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 13:01:35.609206 6180 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 13:01:35.609217 6180 factory.go:656] Stopping watch factory\\\\nI0202 13:01:35.609232 6180 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 13:01:35.609240 6180 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 13:01:35.609246 6180 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 13:01:35.609251 6180 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 13:01:35.609263 6180 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 13:01:35.609269 6180 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 13:01:35.608240 6180 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:52Z\\\",\\\"message\\\":\\\"map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 13:01:52.319818 6399 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-48kgl\\\\nI0202 13:01:52.319835 6399 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0202 13:01:52.319869 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef
0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:52Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.896856 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:52Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.916227 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:52Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.939973 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:52Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.960805 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:52Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.962103 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.962238 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.962325 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.962404 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.962499 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:52Z","lastTransitionTime":"2026-02-02T13:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:52 crc kubenswrapper[4721]: I0202 13:01:52.978280 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e9bc6a3dd0746c09c22e858873a85ac0fd67fe903dfa6f2cd5061020ce5230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c7004e0fbbf58f24a8539f9e323fdba3cbe648aff6b6c7be443d765260a148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:52Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.011855 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.031129 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.048756 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.064967 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.065774 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.065823 
4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.065835 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.065856 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.065871 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:53Z","lastTransitionTime":"2026-02-02T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.088218 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:0
3Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 
13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.105489 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.122321 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.136427 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.151289 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11
\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:0
1:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6
173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.168198 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.168265 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.168288 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.168320 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.168342 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:53Z","lastTransitionTime":"2026-02-02T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.272212 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.272273 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.272290 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.272314 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.272331 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:53Z","lastTransitionTime":"2026-02-02T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.375645 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.375713 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.375731 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.375758 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.375782 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:53Z","lastTransitionTime":"2026-02-02T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.390497 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 23:20:25.116386108 +0000 UTC
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.409041 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.409171 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.409057 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 13:01:53 crc kubenswrapper[4721]: E0202 13:01:53.409314 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 13:01:53 crc kubenswrapper[4721]: E0202 13:01:53.409482 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e"
Feb 02 13:01:53 crc kubenswrapper[4721]: E0202 13:01:53.409761 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.478202 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.479152 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.479213 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.479224 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.479243 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.479254 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:53Z","lastTransitionTime":"2026-02-02T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.499722 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.510558 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-
h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason
\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.527652 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.548752 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.568032 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.581422 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.581461 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.581471 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.581490 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.581501 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:53Z","lastTransitionTime":"2026-02-02T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.584903 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.600582 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-oper
ator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.614113 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.647127 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://128d117e2ed6ed05a65511943fe20f8f9d5cd3f62044941596a47d3b7a03eea8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:35Z\\\",\\\"message\\\":\\\"ace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.608892 6180 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 13:01:35.609124 6180 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 13:01:35.609151 6180 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 13:01:35.609186 6180 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 13:01:35.609190 6180 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 13:01:35.609206 6180 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 13:01:35.609217 6180 factory.go:656] Stopping watch factory\\\\nI0202 13:01:35.609232 6180 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 13:01:35.609240 6180 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 13:01:35.609246 6180 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 13:01:35.609251 6180 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 13:01:35.609263 6180 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 13:01:35.609269 6180 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 13:01:35.608240 6180 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:52Z\\\",\\\"message\\\":\\\"map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 13:01:52.319818 6399 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-48kgl\\\\nI0202 13:01:52.319835 6399 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0202 13:01:52.319869 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.667402 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.685169 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.685235 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.685253 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.685279 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.685303 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:53Z","lastTransitionTime":"2026-02-02T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.690093 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.708481 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.724238 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.741911 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e9bc6a3dd0746c09c22e858873a85ac0fd67fe903dfa6f2cd5061020ce5230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c7004e0fbbf58f24a8539f9e323fdba3cbe648aff6b6c7be443d765260a148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 
13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.776643 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.787535 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.787593 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.787610 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.787635 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.787654 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:53Z","lastTransitionTime":"2026-02-02T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.795442 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.811340 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovnkube-controller/2.log" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.817717 4721 scope.go:117] "RemoveContainer" containerID="6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1" Feb 02 13:01:53 crc kubenswrapper[4721]: E0202 13:01:53.818064 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.820598 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.845490 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.882459 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.889997 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.890056 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.890095 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.890122 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.890141 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:53Z","lastTransitionTime":"2026-02-02T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.902987 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.916022 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.927901 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.942281 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e9bc6a3dd0746c09c22e858873a85ac0fd67fe903dfa6f2cd5061020ce5230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c7004e0fbbf58f24a8539f9e323fdba3cbe648aff6b6c7be443d765260a148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 
13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.963661 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.980155 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.993054 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T13:01:53Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.993537 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.993608 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.993621 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.993645 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:53 crc kubenswrapper[4721]: I0202 13:01:53.993660 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:53Z","lastTransitionTime":"2026-02-02T13:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.007058 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:54Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.024120 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:54Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.043736 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:54Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.062392 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:54Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.077124 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:54Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.096903 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.096968 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.096987 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.097014 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.097034 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:54Z","lastTransitionTime":"2026-02-02T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.097629 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:54Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.115929 4721 status_manager.go:875] "Failed to update status for pod" 
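The status-patch failures above share one root cause: every pod status PATCH is routed through the pod.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743, and that webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2026-02-02. Below is a minimal Go sketch (not kubelet code; the certificate path is hypothetical) of the x509 validity-window check that produces this class of error.

// cert_expiry_check.go - minimal sketch of the validity-window comparison
// behind "certificate has expired or is not yet valid" errors in this log.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Hypothetical path; in the cluster this would be the webhook's serving cert.
	raw, err := os.ReadFile("/tmp/webhook-serving.crt")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	block, _ := pem.Decode(raw)
	if block == nil {
		fmt.Fprintln(os.Stderr, "no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	now := time.Now()
	// crypto/x509 performs the same window comparison during verification;
	// the logged "current time %s is after %s" corresponds to now > NotAfter.
	switch {
	case now.Before(cert.NotBefore):
		fmt.Printf("certificate not yet valid: current time %s is before %s\n",
			now.UTC().Format(time.RFC3339), cert.NotBefore.UTC().Format(time.RFC3339))
	case now.After(cert.NotAfter):
		fmt.Printf("certificate has expired: current time %s is after %s\n",
			now.UTC().Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}

Until the webhook certificate is rotated, every status PATCH fails the same way, which is why the identical x509 message recurs for each pod in the records that follow.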
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e7b999-d47e-409d-bafe-ffde8c03995e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3b10d453f884ce1095dd9bb0f8d91ebd466c4e2771a3a425abbe3001084cb09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da1bc82fcd0af4a3c1e3508fe525c95017c03f521388a28e7eaad2bb5dbb0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8797e4a71975106b9aa6c6519ef0d64047df3b21e3164f586b1441f5897e0e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:54Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.127994 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:54Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.139583 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:54Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.156789 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:52Z\\\",\\\"message\\\":\\\"map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 13:01:52.319818 6399 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-48kgl\\\\nI0202 13:01:52.319835 6399 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0202 13:01:52.319869 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:54Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.200510 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.200564 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.200580 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.200603 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.200620 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:54Z","lastTransitionTime":"2026-02-02T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.303418 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.303470 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.303487 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.303509 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.303523 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:54Z","lastTransitionTime":"2026-02-02T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.390840 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 17:40:37.637632854 +0000 UTC Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.406734 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.406770 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.406784 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.406802 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.406818 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:54Z","lastTransitionTime":"2026-02-02T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.409293 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:54 crc kubenswrapper[4721]: E0202 13:01:54.409453 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
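The recurring Ready=False condition comes from kubelet's network-readiness probe: the CNI plugin reports NetworkReady=false for as long as /etc/kubernetes/cni/net.d/ holds no network configuration, and pods that need pod networking (such as the networking-console-plugin sandbox above) are skipped until it does. A simplified sketch of that directory check follows; the extension list mirrors what libcni loads, but treat the whole probe as illustrative rather than the actual kubelet implementation.

// cni_conf_probe.go - simplified sketch of the readiness check behind
// "no CNI configuration file in /etc/kubernetes/cni/net.d/".
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

func cniConfigPresent(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch strings.ToLower(filepath.Ext(e.Name())) {
		case ".conf", ".conflist", ".json": // config file types libcni accepts
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := cniConfigPresent("/etc/kubernetes/cni/net.d")
	if err != nil || !ok {
		fmt.Println("NetworkReady=false: no CNI configuration file found")
		return
	}
	fmt.Println("NetworkReady=true")
}

Here the CNI config is normally written by ovnkube-node, whose ovnkube-controller container is itself in CrashLoopBackOff on the expired webhook certificate, so the node cannot become Ready until the certificate problem is resolved.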
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.510059 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.510154 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.510194 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.510226 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.510249 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:54Z","lastTransitionTime":"2026-02-02T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.612973 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.613033 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.613056 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.613114 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.613140 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:54Z","lastTransitionTime":"2026-02-02T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.716557 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.716830 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.716853 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.716885 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.716912 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:54Z","lastTransitionTime":"2026-02-02T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.819469 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.819531 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.819542 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.819565 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.819577 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:54Z","lastTransitionTime":"2026-02-02T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.922299 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.922639 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.922711 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.922844 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:54 crc kubenswrapper[4721]: I0202 13:01:54.922931 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:54Z","lastTransitionTime":"2026-02-02T13:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.025994 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.026034 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.026046 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.026154 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.026171 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:55Z","lastTransitionTime":"2026-02-02T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.130425 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.130487 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.130499 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.130522 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.130537 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:55Z","lastTransitionTime":"2026-02-02T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.234461 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.234529 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.234544 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.234571 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.234589 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:55Z","lastTransitionTime":"2026-02-02T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.337957 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.338028 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.338046 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.338111 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.338135 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:55Z","lastTransitionTime":"2026-02-02T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.391412 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 12:39:40.4444822 +0000 UTC Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.409774 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.409836 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.409910 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:55 crc kubenswrapper[4721]: E0202 13:01:55.410055 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:01:55 crc kubenswrapper[4721]: E0202 13:01:55.410224 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:01:55 crc kubenswrapper[4721]: E0202 13:01:55.410336 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
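The certificate_manager.go lines record the kubelet-serving side of the same clock problem: the serving certificate is valid until 2026-02-24, but the jittered rotation deadlines it computes (2025-11-28 and 2025-11-14 in the two lines above) already lie in the past at the node's current time, so the manager re-evaluates and logs a fresh deadline on every pass. A sketch of how such a deadline can be derived is below; the 70-90% jitter window is an assumption about client-go's behavior, and the notBefore value is invented since the log only shows the expiration.

// rotation_deadline.go - illustrative sketch of a jittered certificate
// rotation deadline like the ones logged by certificate_manager.go.
package main

import (
	"fmt"
	"math/rand"
	"time"
)

func rotationDeadline(notBefore, notAfter time.Time, r *rand.Rand) time.Time {
	total := notAfter.Sub(notBefore)
	// Assumed jitter: pick a point in the 70-90% region of the lifetime.
	jittered := time.Duration(float64(total) * (0.7 + 0.2*r.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	r := rand.New(rand.NewSource(1))
	notBefore := time.Date(2025, 8, 24, 17, 21, 41, 0, time.UTC) // assumed issue time
	notAfter := time.Date(2026, 2, 24, 5, 53, 3, 0, time.UTC)    // expiration from the log
	deadline := rotationDeadline(notBefore, notAfter, r)
	fmt.Println("rotation deadline:", deadline)
	// When the node clock is already past the deadline (here it fell in
	// Nov 2025 but the clock reads Feb 2026), rotation is attempted at once
	// and the deadline is logged on every evaluation.
	fmt.Println("overdue:", time.Now().After(deadline))
}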
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.441385 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.441453 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.441503 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.441531 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.441554 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:55Z","lastTransitionTime":"2026-02-02T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.545137 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.545224 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.545237 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.545254 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.545269 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:55Z","lastTransitionTime":"2026-02-02T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.648647 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.648721 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.648742 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.648772 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.648794 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:55Z","lastTransitionTime":"2026-02-02T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.751770 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.751872 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.751886 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.751903 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.751918 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:55Z","lastTransitionTime":"2026-02-02T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.854540 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.854924 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.854948 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.854977 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.855002 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:55Z","lastTransitionTime":"2026-02-02T13:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:55 crc kubenswrapper[4721]: I0202 13:01:55.859510 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs\") pod \"network-metrics-daemon-xqz79\" (UID: \"bfab3ffb-8798-423d-9b55-83868b76a14e\") " pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:55 crc kubenswrapper[4721]: E0202 13:01:55.859758 4721 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 13:01:55 crc kubenswrapper[4721]: E0202 13:01:55.859848 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs podName:bfab3ffb-8798-423d-9b55-83868b76a14e nodeName:}" failed. No retries permitted until 2026-02-02 13:02:11.859823706 +0000 UTC m=+72.162338125 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs") pod "network-metrics-daemon-xqz79" (UID: "bfab3ffb-8798-423d-9b55-83868b76a14e") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.356478 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.356539 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.356552 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.356570 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.356582 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:56Z","lastTransitionTime":"2026-02-02T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
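The failed metrics-certs mount above is not retried immediately: nestedpendingoperations backs off exponentially per volume, and the logged durationBeforeRetry of 16s is consistent with a doubling delay starting from 500ms, i.e. a sixth consecutive failure. Both constants in the sketch below are assumptions for illustration, not values read from this log.

// mount_backoff.go - sketch of the exponential backoff behind
// "No retries permitted until ... (durationBeforeRetry 16s)".
package main

import (
	"fmt"
	"time"
)

func durationBeforeRetry(failures int, initial, max time.Duration) time.Duration {
	d := initial
	for i := 1; i < failures; i++ {
		d *= 2 // double the delay on each consecutive failure
		if d > max {
			return max
		}
	}
	return d
}

func main() {
	for n := 1; n <= 7; n++ {
		fmt.Printf("failure %d -> retry in %s\n",
			n, durationBeforeRetry(n, 500*time.Millisecond, 2*time.Minute+2*time.Second))
	}
	// failure 6 -> retry in 16s, matching the durationBeforeRetry in the log.
}

The underlying cause ("object \"openshift-multus\"/\"metrics-daemon-secret\" not registered") means kubelet has no informer registration for that secret yet, so the mount will keep failing and the backoff keeps growing until the secret becomes available.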
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.358442 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.358586 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.358648 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.358730 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.358800 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:56Z","lastTransitionTime":"2026-02-02T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:56 crc kubenswrapper[4721]: E0202 13:01:56.373590 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:56Z is after 2025-08-24T17:21:41Z"
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.378321 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.378365 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.378375 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.378392 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.378405 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:56Z","lastTransitionTime":"2026-02-02T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.391734 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 05:31:08.632419738 +0000 UTC
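The patch failure above, together with the certificate_manager entry, points at the underlying fault: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24, long before the log's clock of 2026-02-02. A small Go probe, sketched under the assumption that the webhook port is reachable from the node (illustrative tooling, not part of the kubelet), reads back the validity window the x509 error complains about:

    // certprobe.go - sketch for confirming the expired webhook serving certificate.
    package main

    import (
    	"crypto/tls"
    	"fmt"
    	"os"
    	"time"
    )

    func main() {
    	// Endpoint taken from the error above; verification is disabled solely
    	// so the presented certificate can be inspected, never trusted.
    	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
    	if err != nil {
    		fmt.Printf("dial failed: %v\n", err)
    		os.Exit(1)
    	}
    	defer conn.Close()

    	now := time.Now().UTC()
    	for _, cert := range conn.ConnectionState().PeerCertificates {
    		fmt.Printf("subject=%q notBefore=%s notAfter=%s expired=%t\n",
    			cert.Subject.CommonName,
    			cert.NotBefore.UTC().Format(time.RFC3339),
    			cert.NotAfter.UTC().Format(time.RFC3339),
    			now.After(cert.NotAfter))
    	}
    }

InsecureSkipVerify is defensible here only because the connection is used to read the certificate and immediately closed; no data is exchanged over it.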
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.395881 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.395930 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.395942 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.395962 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.395975 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:56Z","lastTransitionTime":"2026-02-02T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.409488 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:01:56 crc kubenswrapper[4721]: E0202 13:01:56.409596 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.416940 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.416998 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.417013 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.417032 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.417044 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:56Z","lastTransitionTime":"2026-02-02T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:56 crc kubenswrapper[4721]: E0202 13:01:56.432183 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:56Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.436792 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.436835 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.436847 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.436863 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.436874 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:56Z","lastTransitionTime":"2026-02-02T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:56 crc kubenswrapper[4721]: E0202 13:01:56.451164 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:01:56Z is after 2025-08-24T17:21:41Z" Feb 02 13:01:56 crc kubenswrapper[4721]: E0202 13:01:56.451280 4721 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.458759 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.458800 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.458812 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.458828 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.458841 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:56Z","lastTransitionTime":"2026-02-02T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.561177 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.561237 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.561285 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.561319 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.561342 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:56Z","lastTransitionTime":"2026-02-02T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.664005 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.664043 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.664056 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.664092 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.664104 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:56Z","lastTransitionTime":"2026-02-02T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.767764 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.767815 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.767831 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.767851 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.767865 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:56Z","lastTransitionTime":"2026-02-02T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.870228 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.870287 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.870310 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.870340 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.870364 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:56Z","lastTransitionTime":"2026-02-02T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.973773 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.973851 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.973876 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.973908 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:56 crc kubenswrapper[4721]: I0202 13:01:56.973931 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:56Z","lastTransitionTime":"2026-02-02T13:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.076802 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.076833 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.076842 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.076856 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.076866 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:57Z","lastTransitionTime":"2026-02-02T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.179619 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.179676 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.179693 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.179715 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.179732 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:57Z","lastTransitionTime":"2026-02-02T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.283340 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.283391 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.283408 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.283432 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.283449 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:57Z","lastTransitionTime":"2026-02-02T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.386709 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.386994 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.387056 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.387141 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.387197 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:57Z","lastTransitionTime":"2026-02-02T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.391915 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 13:39:01.064491913 +0000 UTC Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.409462 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:57 crc kubenswrapper[4721]: E0202 13:01:57.409613 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.409468 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.409803 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:57 crc kubenswrapper[4721]: E0202 13:01:57.409988 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:01:57 crc kubenswrapper[4721]: E0202 13:01:57.409821 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.489684 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.489720 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.489739 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.489754 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.489766 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:57Z","lastTransitionTime":"2026-02-02T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.593154 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.593226 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.593252 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.593282 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.593306 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:57Z","lastTransitionTime":"2026-02-02T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.696551 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.696605 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.696621 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.696642 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.696657 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:57Z","lastTransitionTime":"2026-02-02T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.800441 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.800516 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.800535 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.800558 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.800570 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:57Z","lastTransitionTime":"2026-02-02T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.904108 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.904172 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.904189 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.904215 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:57 crc kubenswrapper[4721]: I0202 13:01:57.904228 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:57Z","lastTransitionTime":"2026-02-02T13:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.007763 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.007802 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.007812 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.007829 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.007841 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:58Z","lastTransitionTime":"2026-02-02T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.110682 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.110730 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.110741 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.110761 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.110772 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:58Z","lastTransitionTime":"2026-02-02T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.213562 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.213764 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.213793 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.213830 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.213857 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:58Z","lastTransitionTime":"2026-02-02T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.315966 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.316009 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.316020 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.316036 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.316047 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:58Z","lastTransitionTime":"2026-02-02T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.392303 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 14:25:46.255348908 +0000 UTC Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.409560 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:01:58 crc kubenswrapper[4721]: E0202 13:01:58.409786 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.419025 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.419085 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.419095 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.419107 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.419117 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:58Z","lastTransitionTime":"2026-02-02T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.522004 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.522051 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.522096 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.522118 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.522133 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:58Z","lastTransitionTime":"2026-02-02T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.625595 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.625656 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.625667 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.625684 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.625697 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:58Z","lastTransitionTime":"2026-02-02T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.729434 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.729476 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.729487 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.729509 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.729524 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:58Z","lastTransitionTime":"2026-02-02T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.833283 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.833367 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.833386 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.833413 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.833430 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:58Z","lastTransitionTime":"2026-02-02T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.936936 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.937013 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.937040 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.937104 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:58 crc kubenswrapper[4721]: I0202 13:01:58.937131 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:58Z","lastTransitionTime":"2026-02-02T13:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.040760 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.040809 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.040821 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.040839 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.040851 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:59Z","lastTransitionTime":"2026-02-02T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.143528 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.143591 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.143609 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.143630 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.143644 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:59Z","lastTransitionTime":"2026-02-02T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.247274 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.247349 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.247368 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.247393 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.247410 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:59Z","lastTransitionTime":"2026-02-02T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.350354 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.350400 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.350411 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.350425 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.350437 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:59Z","lastTransitionTime":"2026-02-02T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.392794 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 08:35:01.208020168 +0000 UTC Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.409456 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.409487 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.409485 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:01:59 crc kubenswrapper[4721]: E0202 13:01:59.409564 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:01:59 crc kubenswrapper[4721]: E0202 13:01:59.409650 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:01:59 crc kubenswrapper[4721]: E0202 13:01:59.409742 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.453846 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.453904 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.453919 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.453941 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.453960 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:59Z","lastTransitionTime":"2026-02-02T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.558158 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.558256 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.558286 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.558313 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.558337 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:59Z","lastTransitionTime":"2026-02-02T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.661583 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.662003 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.662296 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.662441 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.662598 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:59Z","lastTransitionTime":"2026-02-02T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.767521 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.767889 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.768033 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.768178 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.768284 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:59Z","lastTransitionTime":"2026-02-02T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.871660 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.871735 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.871753 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.871777 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.871794 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:59Z","lastTransitionTime":"2026-02-02T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.975477 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.975526 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.975539 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.975560 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:01:59 crc kubenswrapper[4721]: I0202 13:01:59.975572 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:01:59Z","lastTransitionTime":"2026-02-02T13:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.078097 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.078148 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.078159 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.078174 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.078184 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:00Z","lastTransitionTime":"2026-02-02T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.181360 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.181417 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.181430 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.181452 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.181468 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:00Z","lastTransitionTime":"2026-02-02T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.284628 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.284682 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.284691 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.284710 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.284725 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:00Z","lastTransitionTime":"2026-02-02T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.387440 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.387515 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.387527 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.387546 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.387559 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:00Z","lastTransitionTime":"2026-02-02T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.393854 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 03:46:49.843847152 +0000 UTC
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.409528 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:02:00 crc kubenswrapper[4721]: E0202 13:02:00.409766 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.427627 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.442162 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.461455 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.477232 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d746
2\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.491677 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.491729 4721 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.491746 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.491767 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.491782 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:00Z","lastTransitionTime":"2026-02-02T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.493462 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.513235 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:52Z\\\",\\\"message\\\":\\\"map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 13:01:52.319818 6399 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-48kgl\\\\nI0202 13:01:52.319835 6399 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0202 13:01:52.319869 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.524894 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e7b999-d47e-409d-bafe-ffde8c03995e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3b10d453f884ce1095dd9bb0f8d91ebd466c4e2771a3a425abbe3001084cb09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da1bc82fcd0af4a3c1e3508fe525c95017c03f521388a28e7eaad2bb5dbb0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8797e4a71975106b9aa6c6519ef0d64047df3b21e3164f586b1441f5897e0e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z"
Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.541853 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.553565 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e
6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.562822 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.574203 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e9bc6a3dd0746c09c22e858873a85ac0fd67fe903dfa6f2cd5061020ce5230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c7004e0fbbf58f24a8539f9e323fdba3cbe648aff6b6c7be443d765260a148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"
env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.590566 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.595455 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.595498 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.595510 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.595531 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.595545 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:00Z","lastTransitionTime":"2026-02-02T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.609582 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.626950 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.644189 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.676008 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.694104 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.698096 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.698170 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.698190 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.698215 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.698232 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:00Z","lastTransitionTime":"2026-02-02T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.709920 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:00Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.800973 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.801055 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.801122 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.801169 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.801201 4721 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:00Z","lastTransitionTime":"2026-02-02T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.904511 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.904574 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.904585 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.904606 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:00 crc kubenswrapper[4721]: I0202 13:02:00.904619 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:00Z","lastTransitionTime":"2026-02-02T13:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.007390 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.007462 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.007481 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.007507 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.007528 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:01Z","lastTransitionTime":"2026-02-02T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.110522 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.110600 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.110616 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.110651 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.110666 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:01Z","lastTransitionTime":"2026-02-02T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.213044 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.213115 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.213128 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.213144 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.213155 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:01Z","lastTransitionTime":"2026-02-02T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.315726 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.315772 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.315781 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.315799 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.315811 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:01Z","lastTransitionTime":"2026-02-02T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.394229 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 18:44:38.044251768 +0000 UTC Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.409773 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.409827 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.409958 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:01 crc kubenswrapper[4721]: E0202 13:02:01.410113 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:01 crc kubenswrapper[4721]: E0202 13:02:01.410231 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:01 crc kubenswrapper[4721]: E0202 13:02:01.410407 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.418589 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.418644 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.418660 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.418682 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.419193 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:01Z","lastTransitionTime":"2026-02-02T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.523241 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.523289 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.523301 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.523318 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.523329 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:01Z","lastTransitionTime":"2026-02-02T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.626960 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.627016 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.627037 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.627102 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.627125 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:01Z","lastTransitionTime":"2026-02-02T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.730961 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.731033 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.731051 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.731114 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.731157 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:01Z","lastTransitionTime":"2026-02-02T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.834659 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.834714 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.834727 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.834743 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.834753 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:01Z","lastTransitionTime":"2026-02-02T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.937888 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.937935 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.937947 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.937965 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:01 crc kubenswrapper[4721]: I0202 13:02:01.937978 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:01Z","lastTransitionTime":"2026-02-02T13:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.040479 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.040522 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.040532 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.040558 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.040570 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:02Z","lastTransitionTime":"2026-02-02T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.143697 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.143756 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.143770 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.143790 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.143801 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:02Z","lastTransitionTime":"2026-02-02T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.246873 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.246939 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.246954 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.246976 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.246993 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:02Z","lastTransitionTime":"2026-02-02T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.350375 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.350440 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.350468 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.350504 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.350529 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:02Z","lastTransitionTime":"2026-02-02T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.394430 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 17:04:27.916590308 +0000 UTC Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.409220 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:02 crc kubenswrapper[4721]: E0202 13:02:02.409525 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.454591 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.454648 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.454673 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.454705 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.454729 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:02Z","lastTransitionTime":"2026-02-02T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.558494 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.558531 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.558540 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.558557 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.558568 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:02Z","lastTransitionTime":"2026-02-02T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.663388 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.663447 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.663463 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.663492 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.663512 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:02Z","lastTransitionTime":"2026-02-02T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.766655 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.766716 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.766725 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.766743 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.766778 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:02Z","lastTransitionTime":"2026-02-02T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.869776 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.869815 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.869831 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.869852 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.869864 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:02Z","lastTransitionTime":"2026-02-02T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.973003 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.973041 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.973051 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.973093 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:02 crc kubenswrapper[4721]: I0202 13:02:02.973107 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:02Z","lastTransitionTime":"2026-02-02T13:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.076096 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.076131 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.076142 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.076165 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.076175 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:03Z","lastTransitionTime":"2026-02-02T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.179182 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.179218 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.179227 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.179244 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.179264 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:03Z","lastTransitionTime":"2026-02-02T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.282133 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.282167 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.282175 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.282191 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.282202 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:03Z","lastTransitionTime":"2026-02-02T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.385464 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.385503 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.385514 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.385535 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.385548 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:03Z","lastTransitionTime":"2026-02-02T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.395332 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 12:43:02.190844105 +0000 UTC Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.409398 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:03 crc kubenswrapper[4721]: E0202 13:02:03.409670 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.410028 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:03 crc kubenswrapper[4721]: E0202 13:02:03.410229 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.410348 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:03 crc kubenswrapper[4721]: E0202 13:02:03.410468 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.488492 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.488618 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.488633 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.488651 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.488663 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:03Z","lastTransitionTime":"2026-02-02T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.592681 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.592731 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.592749 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.592772 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.592789 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:03Z","lastTransitionTime":"2026-02-02T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.697818 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.697903 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.697961 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.697985 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.698002 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:03Z","lastTransitionTime":"2026-02-02T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.800806 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.800838 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.800847 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.800861 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.800872 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:03Z","lastTransitionTime":"2026-02-02T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.904095 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.904150 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.904163 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.904187 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:03 crc kubenswrapper[4721]: I0202 13:02:03.904201 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:03Z","lastTransitionTime":"2026-02-02T13:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.007573 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.007635 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.007644 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.007661 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.007671 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:04Z","lastTransitionTime":"2026-02-02T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.110205 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.110274 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.110286 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.110310 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.110325 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:04Z","lastTransitionTime":"2026-02-02T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.213376 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.213436 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.213487 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.213523 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.213540 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:04Z","lastTransitionTime":"2026-02-02T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.317656 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.317723 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.317735 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.317756 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.317772 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:04Z","lastTransitionTime":"2026-02-02T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.395990 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 01:47:15.710102624 +0000 UTC Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.409487 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:04 crc kubenswrapper[4721]: E0202 13:02:04.409620 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.420334 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.420359 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.420367 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.420381 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.420392 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:04Z","lastTransitionTime":"2026-02-02T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.524123 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.524160 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.524173 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.524188 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.524199 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:04Z","lastTransitionTime":"2026-02-02T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.626723 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.626757 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.626769 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.626784 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.626795 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:04Z","lastTransitionTime":"2026-02-02T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.729803 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.729852 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.729863 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.729887 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.729899 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:04Z","lastTransitionTime":"2026-02-02T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.832277 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.832313 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.832323 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.832336 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.832354 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:04Z","lastTransitionTime":"2026-02-02T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.934847 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.934888 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.934899 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.934914 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:04 crc kubenswrapper[4721]: I0202 13:02:04.934927 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:04Z","lastTransitionTime":"2026-02-02T13:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.037707 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.038013 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.038141 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.038254 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.038360 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:05Z","lastTransitionTime":"2026-02-02T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.140569 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.140634 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.140645 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.140666 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.140679 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:05Z","lastTransitionTime":"2026-02-02T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.243453 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.243516 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.243528 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.243550 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.243564 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:05Z","lastTransitionTime":"2026-02-02T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.346407 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.346484 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.346505 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.346529 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.346544 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:05Z","lastTransitionTime":"2026-02-02T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.397027 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 09:01:25.734661984 +0000 UTC Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.409438 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.409476 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.409438 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:05 crc kubenswrapper[4721]: E0202 13:02:05.409610 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:05 crc kubenswrapper[4721]: E0202 13:02:05.409684 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:05 crc kubenswrapper[4721]: E0202 13:02:05.409846 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.410753 4721 scope.go:117] "RemoveContainer" containerID="6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1" Feb 02 13:02:05 crc kubenswrapper[4721]: E0202 13:02:05.411107 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.449840 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.449929 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.449955 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.450010 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.450029 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:05Z","lastTransitionTime":"2026-02-02T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.552910 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.552958 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.552970 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.552991 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.553005 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:05Z","lastTransitionTime":"2026-02-02T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.655830 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.656313 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.656602 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.656917 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.657190 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:05Z","lastTransitionTime":"2026-02-02T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.760703 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.760755 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.760767 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.760783 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.760796 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:05Z","lastTransitionTime":"2026-02-02T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.862573 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.862832 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.862930 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.863027 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.863141 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:05Z","lastTransitionTime":"2026-02-02T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.965243 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.965305 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.965322 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.965369 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:05 crc kubenswrapper[4721]: I0202 13:02:05.965386 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:05Z","lastTransitionTime":"2026-02-02T13:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.068541 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.068848 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.068929 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.069018 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.069148 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:06Z","lastTransitionTime":"2026-02-02T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.172727 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.172773 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.172781 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.172796 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.172805 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:06Z","lastTransitionTime":"2026-02-02T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.275827 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.276176 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.276271 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.276375 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.276442 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:06Z","lastTransitionTime":"2026-02-02T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.379957 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.380021 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.380040 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.380104 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.380126 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:06Z","lastTransitionTime":"2026-02-02T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.397259 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 08:07:18.786488896 +0000 UTC Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.409507 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:06 crc kubenswrapper[4721]: E0202 13:02:06.409681 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.482317 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.482346 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.482358 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.482369 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.482378 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:06Z","lastTransitionTime":"2026-02-02T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.578359 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.578421 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.578434 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.578456 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.578468 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:06Z","lastTransitionTime":"2026-02-02T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:06 crc kubenswrapper[4721]: E0202 13:02:06.592600 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:06Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.597895 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.597949 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.597967 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.597990 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.598006 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:06Z","lastTransitionTime":"2026-02-02T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:06 crc kubenswrapper[4721]: E0202 13:02:06.615662 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:06Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.619762 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.619806 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.619816 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.619836 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.619847 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:06Z","lastTransitionTime":"2026-02-02T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:06 crc kubenswrapper[4721]: E0202 13:02:06.632551 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:06Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.636487 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.636533 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.636567 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.636593 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.636610 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:06Z","lastTransitionTime":"2026-02-02T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:06 crc kubenswrapper[4721]: E0202 13:02:06.649714 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:06Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.653543 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.653580 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.653591 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.653606 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.653620 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:06Z","lastTransitionTime":"2026-02-02T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:06 crc kubenswrapper[4721]: E0202 13:02:06.669300 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:06Z is after 2025-08-24T17:21:41Z"
Feb 02 13:02:06 crc kubenswrapper[4721]: E0202 13:02:06.669498 4721 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
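The two errors above share a single root cause: the network-node-identity admission webhook at https://127.0.0.1:9743 is serving a certificate that expired on 2025-08-24, so the API server rejects every node-status patch, and the kubelet gives up once it exhausts its retry budget ("update node status exceeds retry count"). A minimal diagnostic sketch for confirming the expired serving certificate from the node, assuming the endpoint and port taken from the Post URL in the log (an illustrative check, not part of kubelet):

// certcheck.go: connect to the webhook endpoint named in the log and print
// the serving certificate's validity window. The 127.0.0.1:9743 address is
// taken from the log; everything else is a generic TLS probe.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // we want the cert even though verification would fail
	})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
	if now.After(cert.NotAfter) {
		// Matches the failure mode in the log: "x509: certificate has expired"
		fmt.Printf("EXPIRED: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	}
}

If the diagnosis is right, the printed notAfter should match the 2025-08-24T17:21:41Z in the error text.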
event="NodeHasSufficientMemory" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.671214 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.671231 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.671249 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.671262 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:06Z","lastTransitionTime":"2026-02-02T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.774672 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.775110 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.775332 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.775556 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.775721 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:06Z","lastTransitionTime":"2026-02-02T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.878417 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.878678 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.878772 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.878862 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.879218 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:06Z","lastTransitionTime":"2026-02-02T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.982210 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.982260 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.982271 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.982289 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:06 crc kubenswrapper[4721]: I0202 13:02:06.982300 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:06Z","lastTransitionTime":"2026-02-02T13:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.084974 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.085024 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.085037 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.085055 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.085088 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:07Z","lastTransitionTime":"2026-02-02T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.187456 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.187539 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.187551 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.187569 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.187586 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:07Z","lastTransitionTime":"2026-02-02T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.290388 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.290449 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.290463 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.290485 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.290500 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:07Z","lastTransitionTime":"2026-02-02T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.393267 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.393321 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.393341 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.393364 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.393378 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:07Z","lastTransitionTime":"2026-02-02T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.398556 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 00:38:47.412392037 +0000 UTC Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.408985 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.409085 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:07 crc kubenswrapper[4721]: E0202 13:02:07.409125 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
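Every "network not ready" message in this stretch reduces to one check: the runtime reports NetworkReady=false because no CNI network configuration exists in /etc/kubernetes/cni/net.d/. The presence test is roughly the sketch below; the directory comes from the log, while the accepted extensions (.conf, .conflist, .json) are assumed from common CNI convention rather than lifted from kubelet's actual libcni call:

// cnicheck.go: report whether a CNI config directory contains any usable
// network configuration, mirroring the condition behind "no CNI configuration
// file in /etc/kubernetes/cni/net.d/". The extension list is an assumption
// based on the usual CNI convention, not kubelet source.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d/" // directory named in the log messages
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		return
	}
	var found []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			found = append(found, e.Name())
		}
	}
	if len(found) == 0 {
		fmt.Println("no CNI configuration file found; network plugin would report NotReady")
		return
	}
	fmt.Println("CNI configs:", found)
}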
pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.409085 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:07 crc kubenswrapper[4721]: E0202 13:02:07.409237 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:07 crc kubenswrapper[4721]: E0202 13:02:07.409345 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.496442 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.496488 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.496498 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.496514 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.496524 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:07Z","lastTransitionTime":"2026-02-02T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.599062 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.599125 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.599137 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.599155 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.599167 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:07Z","lastTransitionTime":"2026-02-02T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.701116 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.701150 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.701159 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.701172 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.701182 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:07Z","lastTransitionTime":"2026-02-02T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.803941 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.803977 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.803986 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.803999 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.804017 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:07Z","lastTransitionTime":"2026-02-02T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.907961 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.908021 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.908034 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.908055 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:07 crc kubenswrapper[4721]: I0202 13:02:07.908091 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:07Z","lastTransitionTime":"2026-02-02T13:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.010583 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.010646 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.010661 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.010682 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.010693 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:08Z","lastTransitionTime":"2026-02-02T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.112535 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.112574 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.112584 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.112600 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.112610 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:08Z","lastTransitionTime":"2026-02-02T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.214967 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.215413 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.215498 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.215575 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.215720 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:08Z","lastTransitionTime":"2026-02-02T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.318360 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.318564 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.318692 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.318795 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.318869 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:08Z","lastTransitionTime":"2026-02-02T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.398962 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 06:09:59.011486593 +0000 UTC Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.408889 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:08 crc kubenswrapper[4721]: E0202 13:02:08.409025 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
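The certificate_manager entries are a separate thread from the expired webhook certificate: they track the kubelet-serving certificate, which is still valid until 2026-02-24. Each pass prints a different rotation deadline because the deadline is re-jittered on every evaluation, and all of the printed deadlines (late November through late December 2025) already lie in the past relative to the log clock, so the manager keeps re-evaluating. A sketch of the jittered-deadline computation, assuming a window of roughly 70-90% of the validity period and a one-year certificate lifetime (both are assumptions for illustration; the exact fraction is an implementation detail of client-go's certificate manager):

// rotation.go: compute a jittered rotation deadline for a certificate, in
// the spirit of client-go's certificate manager. The 0.7 + 0.2*rand window
// and the one-year validity are assumptions, not values from kubelet source.
package main

import (
	"fmt"
	"math/rand"
	"time"
)

func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	notAfter, _ := time.Parse("2006-01-02 15:04:05", "2026-02-24 05:53:03") // expiration from the log
	notBefore := notAfter.AddDate(-1, 0, 0)                                // assumed one-year validity
	for i := 0; i < 3; i++ {
		// Each call lands somewhere different in the window, matching the
		// scattered "rotation deadline is ..." values in the log.
		fmt.Println(rotationDeadline(notBefore, notAfter))
	}
}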
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.421091 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.421134 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.421145 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.421162 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.421174 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:08Z","lastTransitionTime":"2026-02-02T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.523613 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.523658 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.523668 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.523683 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.523698 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:08Z","lastTransitionTime":"2026-02-02T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.626330 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.626410 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.626433 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.626461 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.626478 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:08Z","lastTransitionTime":"2026-02-02T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.729098 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.729438 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.729520 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.729599 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.729738 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:08Z","lastTransitionTime":"2026-02-02T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.832212 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.832488 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.832584 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.832799 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.833053 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:08Z","lastTransitionTime":"2026-02-02T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.940010 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.940057 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.940094 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.940114 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:08 crc kubenswrapper[4721]: I0202 13:02:08.940129 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:08Z","lastTransitionTime":"2026-02-02T13:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.044494 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.044564 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.044574 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.044594 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.044606 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:09Z","lastTransitionTime":"2026-02-02T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.147735 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.147802 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.147822 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.147843 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.147856 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:09Z","lastTransitionTime":"2026-02-02T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.250983 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.251045 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.251057 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.251102 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.251116 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:09Z","lastTransitionTime":"2026-02-02T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.354186 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.354268 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.354279 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.354300 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.354312 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:09Z","lastTransitionTime":"2026-02-02T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.400568 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 00:34:18.870162974 +0000 UTC Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.409111 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.409237 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:09 crc kubenswrapper[4721]: E0202 13:02:09.409318 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
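The "No sandbox for pod can be found" / "Error syncing pod, skipping" pairs follow directly from the NotReady condition: the kubelet declines to create a sandbox for any pod that needs the cluster network while the runtime network is down, and the pod worker retries on its next sync pass, which is why the same openshift-network-diagnostics and openshift-multus pods reappear every second or two. Host-network pods (the static kube-scheduler and kube-controller-manager pods seen below) are exempt from this gate. A sketch of the gate with simplified stand-in types, not kubelet's actual signatures:

// syncgate.go: the network-readiness gate applied before pod sync, reduced
// to a sketch. The pod names and error text mirror the log; the types and
// the canSyncPod function are simplified stand-ins for kubelet internals.
package main

import (
	"errors"
	"fmt"
)

type Pod struct {
	Name        string
	HostNetwork bool
}

// networkReady would come from the runtime status (NetworkReady=false above).
func canSyncPod(p Pod, networkReady bool, networkErr error) error {
	if networkReady || p.HostNetwork {
		return nil // host-network pods (e.g. static control-plane pods) still run
	}
	return fmt.Errorf("network is not ready: %w", networkErr)
}

func main() {
	cniErr := errors.New("no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?")
	pods := []Pod{
		{Name: "openshift-multus/network-metrics-daemon-xqz79", HostNetwork: false},
		{Name: "openshift-kube-scheduler/openshift-kube-scheduler-crc", HostNetwork: true},
	}
	for _, p := range pods {
		if err := canSyncPod(p, false, cniErr); err != nil {
			fmt.Printf("Error syncing pod, skipping: %v pod=%q\n", err, p.Name)
			continue
		}
		fmt.Printf("syncing pod %q\n", p.Name)
	}
}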
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.409111 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:09 crc kubenswrapper[4721]: E0202 13:02:09.409395 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:09 crc kubenswrapper[4721]: E0202 13:02:09.409595 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.457581 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.457645 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.457659 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.457687 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.457706 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:09Z","lastTransitionTime":"2026-02-02T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.560018 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.560084 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.560097 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.560115 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.560128 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:09Z","lastTransitionTime":"2026-02-02T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.663476 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.663546 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.663558 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.663579 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.663595 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:09Z","lastTransitionTime":"2026-02-02T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.765969 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.766021 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.766033 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.766051 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.766062 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:09Z","lastTransitionTime":"2026-02-02T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.869589 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.869659 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.869697 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.869718 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.869732 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:09Z","lastTransitionTime":"2026-02-02T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.972655 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.972712 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.972723 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.972736 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:09 crc kubenswrapper[4721]: I0202 13:02:09.972745 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:09Z","lastTransitionTime":"2026-02-02T13:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.075450 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.075519 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.075528 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.075549 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.075565 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:10Z","lastTransitionTime":"2026-02-02T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.178438 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.178508 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.178526 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.178555 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.178575 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:10Z","lastTransitionTime":"2026-02-02T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.281987 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.282032 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.282044 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.282079 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.282090 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:10Z","lastTransitionTime":"2026-02-02T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.384548 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.384595 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.384607 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.384623 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.384635 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:10Z","lastTransitionTime":"2026-02-02T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
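Each "Node became not ready" entry serializes the same Ready condition inline; this is the payload the kubelet is trying (and, given the webhook failure, failing) to persist to the Node object. A minimal sketch that rebuilds the condition JSON from the log, using a hand-rolled struct whose field names follow the Kubernetes NodeCondition wire format (the type itself is a local stand-in for illustration, not imported from k8s.io/api):

// condition.go: reconstruct the Ready=False condition carried by the
// "Node became not ready" entries. Field names match the wire format shown
// in the log; the struct is a local stand-in, not the k8s.io/api type.
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	now := time.Now().UTC().Format(time.RFC3339)
	c := NodeCondition{
		Type:               "Ready",
		Status:             "False",
		LastHeartbeatTime:  now,
		LastTransitionTime: now,
		Reason:             "KubeletNotReady",
		Message: "container runtime network not ready: NetworkReady=false " +
			"reason:NetworkPluginNotReady message:Network plugin returns error: " +
			"no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?",
	}
	b, _ := json.Marshal(c)
	fmt.Println(string(b)) // same shape as the condition={...} payloads above
}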
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.401105 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 12:19:42.97536295 +0000 UTC
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.409549 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:02:10 crc kubenswrapper[4721]: E0202 13:02:10.409697 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.423628 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e7b999-d47e-409d-bafe-ffde8c03995e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3b10d453f884ce1095dd9bb0f8d91ebd466c4e2771a3a425abbe3001084cb09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da1bc82fcd0af4a3c1e3508fe525c95017c03f521388a28e7eaad2bb5dbb0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8797e4a71975106b9aa6c6519ef0d64047df3b21e3164f586b1441f5897e0e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.443289 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.455215 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e
6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.477297 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5
e0c0dd17cff95375c2bc73f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:52Z\\\",\\\"message\\\":\\\"map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 13:01:52.319818 6399 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-48kgl\\\\nI0202 13:01:52.319835 6399 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0202 13:01:52.319869 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.487164 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.487197 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.487209 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.487242 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.487255 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:10Z","lastTransitionTime":"2026-02-02T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.492832 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.511979 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.523842 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.536650 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.551550 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e9bc6a3dd0746c09c22e858873a85ac0fd67fe903dfa6f2cd5061020ce5230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c7004e0fbbf58f24a8539f9e323fdba3cbe648aff6b6c7be443d765260a148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z" Feb 02 
13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.572956 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.589332 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.589541 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.589724 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.589767 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.589798 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.589840 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:10Z","lastTransitionTime":"2026-02-02T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.605442 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.621522 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.639944 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.658564 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.674491 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.688588 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.692392 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.692419 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.692427 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.692440 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.692450 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:10Z","lastTransitionTime":"2026-02-02T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.704420 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:10Z is after 2025-08-24T17:21:41Z"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.794964 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.795007 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.795016 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.795035 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.795043 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:10Z","lastTransitionTime":"2026-02-02T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.898059 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.898118 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.898130 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.898144 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:10 crc kubenswrapper[4721]: I0202 13:02:10.898157 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:10Z","lastTransitionTime":"2026-02-02T13:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.001057 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.001126 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.001138 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.001153 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.001163 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:11Z","lastTransitionTime":"2026-02-02T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.103290 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.103335 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.103345 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.103360 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.103370 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:11Z","lastTransitionTime":"2026-02-02T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.205652 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.205689 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.205698 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.205711 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.205721 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:11Z","lastTransitionTime":"2026-02-02T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.309033 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.309103 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.309118 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.309140 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.309156 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:11Z","lastTransitionTime":"2026-02-02T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.401279 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 08:17:58.924369139 +0000 UTC
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.409775 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.409776 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 13:02:11 crc kubenswrapper[4721]: E0202 13:02:11.409957 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e"
Feb 02 13:02:11 crc kubenswrapper[4721]: E0202 13:02:11.410140 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.410669 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 13:02:11 crc kubenswrapper[4721]: E0202 13:02:11.410808 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.411918 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.411965 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.411973 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.411988 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.411997 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:11Z","lastTransitionTime":"2026-02-02T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.515291 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.515353 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.515550 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.515567 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.515578 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:11Z","lastTransitionTime":"2026-02-02T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.618520 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.618570 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.618583 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.618601 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.618613 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:11Z","lastTransitionTime":"2026-02-02T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.721189 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.721259 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.721272 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.721291 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.721303 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:11Z","lastTransitionTime":"2026-02-02T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.824112 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.824168 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.824185 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.824207 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.824221 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:11Z","lastTransitionTime":"2026-02-02T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.926980 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.927049 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.927095 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.927119 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.927132 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:11Z","lastTransitionTime":"2026-02-02T13:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:11 crc kubenswrapper[4721]: I0202 13:02:11.941779 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs\") pod \"network-metrics-daemon-xqz79\" (UID: \"bfab3ffb-8798-423d-9b55-83868b76a14e\") " pod="openshift-multus/network-metrics-daemon-xqz79"
Feb 02 13:02:11 crc kubenswrapper[4721]: E0202 13:02:11.942030 4721 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 02 13:02:11 crc kubenswrapper[4721]: E0202 13:02:11.942150 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs podName:bfab3ffb-8798-423d-9b55-83868b76a14e nodeName:}" failed. No retries permitted until 2026-02-02 13:02:43.942125986 +0000 UTC m=+104.244640405 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs") pod "network-metrics-daemon-xqz79" (UID: "bfab3ffb-8798-423d-9b55-83868b76a14e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.032174 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.032232 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.032249 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.032272 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.032287 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:12Z","lastTransitionTime":"2026-02-02T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.135239 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.135281 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.135291 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.135309 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.135354 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:12Z","lastTransitionTime":"2026-02-02T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.237580 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.237616 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.237624 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.237636 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.237646 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:12Z","lastTransitionTime":"2026-02-02T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.340248 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.340306 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.340317 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.340342 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.340355 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:12Z","lastTransitionTime":"2026-02-02T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.401834 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 12:04:34.543556836 +0000 UTC
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.425469 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 13:02:12 crc kubenswrapper[4721]: E0202 13:02:12.425633 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.442913 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.442936 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.442943 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.442953 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.442962 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:12Z","lastTransitionTime":"2026-02-02T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.545836 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.545887 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.545897 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.545912 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.545922 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:12Z","lastTransitionTime":"2026-02-02T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.648684 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.648740 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.648756 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.648779 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.648797 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:12Z","lastTransitionTime":"2026-02-02T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.751212 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.751260 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.751269 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.751284 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.751295 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:12Z","lastTransitionTime":"2026-02-02T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.853206 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.853254 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.853265 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.853281 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.853290 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:12Z","lastTransitionTime":"2026-02-02T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.955939 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.955977 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.955986 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.956000 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:12 crc kubenswrapper[4721]: I0202 13:02:12.956011 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:12Z","lastTransitionTime":"2026-02-02T13:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.058676 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.058783 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.058806 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.058834 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.058851 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:13Z","lastTransitionTime":"2026-02-02T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.161907 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.161969 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.161979 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.162000 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.162020 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:13Z","lastTransitionTime":"2026-02-02T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.264671 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.264721 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.264736 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.264755 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.264769 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:13Z","lastTransitionTime":"2026-02-02T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.366879 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.366922 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.366932 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.366946 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.366956 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:13Z","lastTransitionTime":"2026-02-02T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.402769 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 12:58:58.345738664 +0000 UTC
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.409245 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.409261 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79"
Feb 02 13:02:13 crc kubenswrapper[4721]: E0202 13:02:13.409478 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.409268 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:13 crc kubenswrapper[4721]: E0202 13:02:13.409606 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:13 crc kubenswrapper[4721]: E0202 13:02:13.409676 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.469514 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.469561 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.469576 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.469594 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.469607 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:13Z","lastTransitionTime":"2026-02-02T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.572125 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.572197 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.572209 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.572228 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.572241 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:13Z","lastTransitionTime":"2026-02-02T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.675204 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.675252 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.675264 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.675281 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.675292 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:13Z","lastTransitionTime":"2026-02-02T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.778157 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.778448 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.778466 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.778485 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.778498 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:13Z","lastTransitionTime":"2026-02-02T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.881055 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.881119 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.881129 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.881145 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.881156 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:13Z","lastTransitionTime":"2026-02-02T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.885118 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltw7d_5ba84858-caaa-4fba-8eaf-9f7ddece0b3a/kube-multus/0.log" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.885226 4721 generic.go:334] "Generic (PLEG): container finished" podID="5ba84858-caaa-4fba-8eaf-9f7ddece0b3a" containerID="0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6" exitCode=1 Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.885272 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltw7d" event={"ID":"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a","Type":"ContainerDied","Data":"0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6"} Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.885966 4721 scope.go:117] "RemoveContainer" containerID="0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.898194 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:13Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.915355 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:13Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.939203 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:13Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.956295 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:13Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.975035 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:13Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.983556 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.983631 4721 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.983642 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.983667 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.983681 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:13Z","lastTransitionTime":"2026-02-02T13:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:13 crc kubenswrapper[4721]: I0202 13:02:13.994835 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:13Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.017030 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:02:13Z\\\",\\\"message\\\":\\\"2026-02-02T13:01:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7c44ae0c-2ffe-42d2-9e4e-a4f37c32136f\\\\n2026-02-02T13:01:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7c44ae0c-2ffe-42d2-9e4e-a4f37c32136f to /host/opt/cni/bin/\\\\n2026-02-02T13:01:28Z [verbose] multus-daemon 
started\\\\n2026-02-02T13:01:28Z [verbose] Readiness Indicator file check\\\\n2026-02-02T13:02:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.032772 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.048200 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.061750 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.082707 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:52Z\\\",\\\"message\\\":\\\"map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 13:01:52.319818 6399 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-48kgl\\\\nI0202 13:01:52.319835 6399 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0202 13:01:52.319869 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.086569 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.086620 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.086629 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.086648 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.086658 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:14Z","lastTransitionTime":"2026-02-02T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.095537 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e7b999-d47e-409d-bafe-ffde8c03995e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3b10d453f884ce1095dd9bb0f8d91ebd466c4e2771a3a425abbe3001084cb09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da1bc82fcd0af4a3c1e3508fe525c95017c03f521388a28e7eaad2bb5dbb0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8797e4a71975106b9aa6c6519ef0d64047df3b21e3164f586b1441f5897e0e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.110627 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.122820 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.135440 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.149222 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e9bc6a3dd0746c09c22e858873a85ac0fd67fe903dfa6f2cd5061020ce5230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c7004e0fbbf58f24a8539f9e323fdba3cbe648aff6b6c7be443d765260a148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 
13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.169043 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.184830 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.189090 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.189135 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.189148 4721 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.189171 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.189184 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:14Z","lastTransitionTime":"2026-02-02T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.291428 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.291462 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.291471 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.291485 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.291494 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:14Z","lastTransitionTime":"2026-02-02T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.394287 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.394328 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.394339 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.394357 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.394370 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:14Z","lastTransitionTime":"2026-02-02T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.403859 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 14:16:00.081658428 +0000 UTC Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.409187 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:14 crc kubenswrapper[4721]: E0202 13:02:14.409330 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.496725 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.496755 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.496765 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.496783 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.496796 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:14Z","lastTransitionTime":"2026-02-02T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.598971 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.599002 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.599012 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.599025 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.599034 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:14Z","lastTransitionTime":"2026-02-02T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.701639 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.701670 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.701678 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.701692 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.701700 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:14Z","lastTransitionTime":"2026-02-02T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.804313 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.804366 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.804380 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.804402 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.804417 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:14Z","lastTransitionTime":"2026-02-02T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.891569 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltw7d_5ba84858-caaa-4fba-8eaf-9f7ddece0b3a/kube-multus/0.log" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.891634 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltw7d" event={"ID":"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a","Type":"ContainerStarted","Data":"3e01f486c1be69ceb8e869e6feefbef4e07fd230f5cb41ec3adfaaba36430569"} Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.908915 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.908955 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.908964 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.908982 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.908993 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:14Z","lastTransitionTime":"2026-02-02T13:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.909915 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.927234 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.945099 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.960378 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:14 crc kubenswrapper[4721]: I0202 13:02:14.976243 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e9bc6a3dd0746c09c22e858873a85ac0fd67fe903dfa6f2cd5061020ce5230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c7004e0fbbf58f24a8539f9e323fdba3cbe648aff6b6c7be443d765260a148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:14Z is after 2025-08-24T17:21:41Z" Feb 02 
13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.012467 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.012769 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.012789 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.012811 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.012829 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:15Z","lastTransitionTime":"2026-02-02T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.013032 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:15Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.031797 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:15Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.050518 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:15Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.066054 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:15Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.083602 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:15Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.099613 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:15Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.114036 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:15Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.115765 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.115796 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.115807 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.115824 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.115838 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:15Z","lastTransitionTime":"2026-02-02T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.129015 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e01f486c1be69ceb8e869e6feefbef4e07fd230f5cb41ec3adfaaba36430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:02:13Z\\\",\\\"message\\\":\\\"2026-02-02T13:01:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7c44ae0c-2ffe-42d2-9e4e-a4f37c32136f\\\\n2026-02-02T13:01:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7c44ae0c-2ffe-42d2-9e4e-a4f37c32136f to /host/opt/cni/bin/\\\\n2026-02-02T13:01:28Z [verbose] multus-daemon started\\\\n2026-02-02T13:01:28Z [verbose] Readiness Indicator file check\\\\n2026-02-02T13:02:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:15Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.146970 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:15Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.159862 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e7b999-d47e-409d-bafe-ffde8c03995e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3b10d453f884ce1095dd9bb0f8d91ebd466c4e2771a3a425abbe3001084cb09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da1bc82fcd0af4a3c1e3508fe525c95017c03f521388a28e7eaad2bb5dbb0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8797e4a71975106b9aa6c6519ef0d64047df3b21e3164f586b1441f5897e0e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:15Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.171810 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:15Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.187632 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:15Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.211324 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:52Z\\\",\\\"message\\\":\\\"map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 13:01:52.319818 6399 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-48kgl\\\\nI0202 13:01:52.319835 6399 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0202 13:01:52.319869 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:15Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.219447 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.219499 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.219523 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.219553 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.219574 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:15Z","lastTransitionTime":"2026-02-02T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.322571 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.322605 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.322617 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.322631 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.322643 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:15Z","lastTransitionTime":"2026-02-02T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.404817 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 05:32:21.501849207 +0000 UTC Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.409300 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:15 crc kubenswrapper[4721]: E0202 13:02:15.409464 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.409550 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:15 crc kubenswrapper[4721]: E0202 13:02:15.409619 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.409672 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:15 crc kubenswrapper[4721]: E0202 13:02:15.409727 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.425530 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.425570 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.425586 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.425608 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.425624 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:15Z","lastTransitionTime":"2026-02-02T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.529016 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.529114 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.529137 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.529164 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.529187 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:15Z","lastTransitionTime":"2026-02-02T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.631875 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.631934 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.631948 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.631968 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.631984 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:15Z","lastTransitionTime":"2026-02-02T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.735473 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.735563 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.735587 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.735613 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.735645 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:15Z","lastTransitionTime":"2026-02-02T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.839000 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.839164 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.839182 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.839242 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.839270 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:15Z","lastTransitionTime":"2026-02-02T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.942790 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.942834 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.942850 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.942870 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:15 crc kubenswrapper[4721]: I0202 13:02:15.942885 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:15Z","lastTransitionTime":"2026-02-02T13:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.045581 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.045631 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.045643 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.045658 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.045670 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:16Z","lastTransitionTime":"2026-02-02T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.150394 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.150464 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.150482 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.150505 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.150523 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:16Z","lastTransitionTime":"2026-02-02T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.254254 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.254331 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.254353 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.254383 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.254408 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:16Z","lastTransitionTime":"2026-02-02T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.357215 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.357274 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.357289 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.357309 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.357324 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:16Z","lastTransitionTime":"2026-02-02T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.405764 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 12:34:27.214053924 +0000 UTC Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.409264 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:16 crc kubenswrapper[4721]: E0202 13:02:16.409492 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.460193 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.460274 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.460298 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.460332 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.460359 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:16Z","lastTransitionTime":"2026-02-02T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.563729 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.563797 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.563819 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.563847 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.563867 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:16Z","lastTransitionTime":"2026-02-02T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.665964 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.666049 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.666131 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.666167 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.666197 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:16Z","lastTransitionTime":"2026-02-02T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.768667 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.768727 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.768741 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.768758 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.768771 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:16Z","lastTransitionTime":"2026-02-02T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.846827 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.846887 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.846900 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.846924 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.846948 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:16Z","lastTransitionTime":"2026-02-02T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:16 crc kubenswrapper[4721]: E0202 13:02:16.867493 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:16Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.872618 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.872662 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.872675 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.872693 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.872705 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:16Z","lastTransitionTime":"2026-02-02T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:16 crc kubenswrapper[4721]: E0202 13:02:16.890326 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:16Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.894628 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.894672 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
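Every one of these status-patch retries fails on the same root cause: the kubelet's POST to the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743/node is rejected by Go's x509 validity check, because the certificate the webhook serves expired on 2025-08-24T17:21:41Z while the node clock reads 2026-02-02. What follows is a minimal Go sketch for confirming that from the node itself; the address is taken from the failing Post in the log, and disabling verification (so the handshake survives long enough to read the expired certificate) is the only liberty taken. It is a diagnostic sketch, not anything the kubelet itself runs.

// certcheck.go: print the validity window of the certificate served by the
// webhook endpoint that the retries above fail against.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Address copied from the failing Post in the kubelet log; adjust if
	// the webhook listens elsewhere (an assumption, not a fixed contract).
	// InsecureSkipVerify lets us inspect an expired certificate instead of
	// failing the handshake the way the kubelet's TLS client does.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	now := time.Now().UTC()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%s notBefore=%s notAfter=%s\n",
			cert.Subject, cert.NotBefore.UTC(), cert.NotAfter.UTC())
		// The same window check that produces
		// "x509: certificate has expired or is not yet valid".
		if now.Before(cert.NotBefore) || now.After(cert.NotAfter) {
			fmt.Println("  -> expired or not yet valid at", now)
		}
	}
}

If notAfter prints in the past, rotating the network-node-identity serving certificate (or fixing the node clock, if the clock is what is wrong) is what unblocks the node-status patches.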
event="NodeHasNoDiskPressure" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.894684 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.894701 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.894712 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:16Z","lastTransitionTime":"2026-02-02T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:16 crc kubenswrapper[4721]: E0202 13:02:16.916484 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:16Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.920270 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.920330 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
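The record-events / NodeNotReady / failed-patch cycle repeats for every heartbeat, and each retry carries the same multi-kilobyte image list, which is why the raw log is dominated by near-identical entries. Below is a short Go sketch that reduces such a log to a retry count plus the distinct webhook causes; it assumes each journal entry arrives as a single line on stdin (as journalctl -u kubelet emits it), not re-wrapped as in this dump.

// patchfail.go: summarize repeated "Error updating node status" retries
// in a kubelet log. Usage sketch: journalctl -u kubelet | go run patchfail.go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	// Retries embedding the node's image list far exceed bufio.Scanner's
	// default 64 KiB token limit, so enlarge the buffer up front.
	sc.Buffer(make([]byte, 0, 1024*1024), 16*1024*1024)

	retries := 0
	causes := map[string]int{}
	for sc.Scan() {
		line := sc.Text()
		if !strings.Contains(line, "Error updating node status, will retry") {
			continue
		}
		retries++
		// Keep only the root cause after the last "failed to call webhook:".
		if i := strings.LastIndex(line, "failed to call webhook:"); i >= 0 {
			causes[strings.TrimSpace(line[i:])]++
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, "scan:", err)
		os.Exit(1)
	}
	fmt.Printf("status-patch retries: %d\n", retries)
	for cause, n := range causes {
		fmt.Printf("%4dx %s\n", n, cause)
	}
}

The enlarged scanner buffer is the one non-obvious choice: without it, the first oversized retry entry would abort the scan with bufio.ErrTooLong.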
event="NodeHasNoDiskPressure" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.920341 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.920359 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.920372 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:16Z","lastTransitionTime":"2026-02-02T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:16 crc kubenswrapper[4721]: E0202 13:02:16.933827 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:16Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.937415 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.937450 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.937461 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.937477 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.937489 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:16Z","lastTransitionTime":"2026-02-02T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:16 crc kubenswrapper[4721]: E0202 13:02:16.951248 4721 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a135069f-e3e0-400d-820a-6a71848b843d\\\",\\\"systemUUID\\\":\\\"a18387c5-ff07-4cdd-8a5b-70ab978f8648\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:16Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:16 crc kubenswrapper[4721]: E0202 13:02:16.951728 4721 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.953687 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.953720 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.953730 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.953748 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:16 crc kubenswrapper[4721]: I0202 13:02:16.953759 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:16Z","lastTransitionTime":"2026-02-02T13:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.056343 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.056431 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.056445 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.056461 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.056473 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:17Z","lastTransitionTime":"2026-02-02T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.158607 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.158665 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.158678 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.158699 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.158712 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:17Z","lastTransitionTime":"2026-02-02T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.260591 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.260634 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.260648 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.260666 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.260681 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:17Z","lastTransitionTime":"2026-02-02T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.363226 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.363287 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.363312 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.363333 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.363348 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:17Z","lastTransitionTime":"2026-02-02T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.406928 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 13:55:37.409686528 +0000 UTC Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.409303 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.409345 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.409420 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:17 crc kubenswrapper[4721]: E0202 13:02:17.409464 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:17 crc kubenswrapper[4721]: E0202 13:02:17.409576 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:17 crc kubenswrapper[4721]: E0202 13:02:17.409668 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.466601 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.466671 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.466693 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.466723 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.466744 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:17Z","lastTransitionTime":"2026-02-02T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.569902 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.569943 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.569953 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.569971 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.569983 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:17Z","lastTransitionTime":"2026-02-02T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.672573 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.672607 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.672615 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.672628 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.672637 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:17Z","lastTransitionTime":"2026-02-02T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.775944 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.776011 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.776031 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.776057 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.776105 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:17Z","lastTransitionTime":"2026-02-02T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.879919 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.880323 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.880523 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.880711 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.880888 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:17Z","lastTransitionTime":"2026-02-02T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.983024 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.983132 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.983158 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.983189 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:17 crc kubenswrapper[4721]: I0202 13:02:17.983212 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:17Z","lastTransitionTime":"2026-02-02T13:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.086127 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.086192 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.086213 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.086237 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.086254 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:18Z","lastTransitionTime":"2026-02-02T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.188816 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.188879 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.188891 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.188905 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.188914 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:18Z","lastTransitionTime":"2026-02-02T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.291409 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.291460 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.291471 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.291490 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.291501 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:18Z","lastTransitionTime":"2026-02-02T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.394855 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.394896 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.394907 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.394928 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.394945 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:18Z","lastTransitionTime":"2026-02-02T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.407145 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 12:52:20.436345537 +0000 UTC Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.409584 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:18 crc kubenswrapper[4721]: E0202 13:02:18.409788 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.497624 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.497684 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.497701 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.497725 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.497742 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:18Z","lastTransitionTime":"2026-02-02T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.600871 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.600963 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.601004 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.601041 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.601163 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:18Z","lastTransitionTime":"2026-02-02T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.704720 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.704832 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.704853 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.704881 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.704901 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:18Z","lastTransitionTime":"2026-02-02T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.807805 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.807877 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.807896 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.807922 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.807942 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:18Z","lastTransitionTime":"2026-02-02T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.911005 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.911140 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.911161 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.911189 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:18 crc kubenswrapper[4721]: I0202 13:02:18.911206 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:18Z","lastTransitionTime":"2026-02-02T13:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.014805 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.014873 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.014889 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.014912 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.014930 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:19Z","lastTransitionTime":"2026-02-02T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.118250 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.118319 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.118336 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.118361 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.118381 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:19Z","lastTransitionTime":"2026-02-02T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.221910 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.222373 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.222566 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.222813 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.223473 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:19Z","lastTransitionTime":"2026-02-02T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.327060 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.327481 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.327686 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.328574 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.328753 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:19Z","lastTransitionTime":"2026-02-02T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.408010 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 13:27:37.926099986 +0000 UTC Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.409375 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.409391 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.409478 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:19 crc kubenswrapper[4721]: E0202 13:02:19.409797 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:19 crc kubenswrapper[4721]: E0202 13:02:19.410224 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:19 crc kubenswrapper[4721]: E0202 13:02:19.410313 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.411375 4721 scope.go:117] "RemoveContainer" containerID="6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.432269 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.432341 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.432360 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.432389 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.432409 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:19Z","lastTransitionTime":"2026-02-02T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.536131 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.536203 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.536219 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.536241 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.536261 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:19Z","lastTransitionTime":"2026-02-02T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.638766 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.638799 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.638836 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.638853 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.638970 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:19Z","lastTransitionTime":"2026-02-02T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.742397 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.742442 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.742454 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.742475 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.742494 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:19Z","lastTransitionTime":"2026-02-02T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.845579 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.845613 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.845628 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.845643 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.845655 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:19Z","lastTransitionTime":"2026-02-02T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.917421 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovnkube-controller/2.log" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.919953 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerStarted","Data":"0ef87877d327f3682656924644afe911470b558fe3fd45dd708e4d6f0aa69f29"} Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.920405 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.933242 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:19Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 
13:02:19.944343 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e9bc6a3dd0746c09c22e858873a85ac0fd67fe903dfa6f2cd5061020ce5230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c7004e0fbbf58f24a8539f9e323fdba3cbe648aff6b6c7be443d765260a148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:19Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.949510 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.949572 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.949590 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.949609 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.949628 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:19Z","lastTransitionTime":"2026-02-02T13:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.957918 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:19Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.969959 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:19Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.982898 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:19Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:19 crc kubenswrapper[4721]: I0202 13:02:19.994830 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:19Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.019685 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.040572 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.052167 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.052205 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.052214 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.052228 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.052238 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:20Z","lastTransitionTime":"2026-02-02T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.052855 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.064940 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.077041 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e01f486c1be69ceb8e869e6feefbef4e07fd230f5cb41ec3adfaaba36430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:02:13Z\\\",\\\"message\\\":\\\"2026-02-02T13:01:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7c44ae0c-2ffe-42d2-9e4e-a4f37c32136f\\\\n2026-02-02T13:01:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7c44ae0c-2ffe-42d2-9e4e-a4f37c32136f to /host/opt/cni/bin/\\\\n2026-02-02T13:01:28Z [verbose] multus-daemon started\\\\n2026-02-02T13:01:28Z [verbose] Readiness Indicator file check\\\\n2026-02-02T13:02:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.089745 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.102640 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.116157 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.135756 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef87877d327f3682656924644afe911470b558f
e3fd45dd708e4d6f0aa69f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:52Z\\\",\\\"message\\\":\\\"map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 13:01:52.319818 6399 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-48kgl\\\\nI0202 13:01:52.319835 6399 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0202 13:01:52.319869 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.148350 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e7b999-d47e-409d-bafe-ffde8c03995e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3b10d453f884ce1095dd9bb0f8d91ebd466c4e2771a3a425abbe3001084cb09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da1bc82fcd0af4a3c1e3508fe525c95017c03f521388a28e7eaad2bb5dbb0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8797e4a71975106b9aa6c6519ef0d64047df3b21e3164f586b1441f5897e0e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.154380 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.154513 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.154589 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.154650 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.154711 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:20Z","lastTransitionTime":"2026-02-02T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.162638 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.174720 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.257859 4721 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.257913 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.257928 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.257949 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.257964 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:20Z","lastTransitionTime":"2026-02-02T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.360680 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.360716 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.360725 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.360739 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.360750 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:20Z","lastTransitionTime":"2026-02-02T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.408685 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 12:52:54.190244514 +0000 UTC Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.408726 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:20 crc kubenswrapper[4721]: E0202 13:02:20.408945 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.434101 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.451145 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.467449 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.467533 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.467543 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.467567 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.467577 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:20Z","lastTransitionTime":"2026-02-02T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.468389 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.488684 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e01f486c1be69ceb8e869e6feefbef4e07fd230f5cb41ec3adfaaba36430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:02:13Z\\\",\\\"message\\\":\\\"2026-02-02T13:01:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7c44ae0c-2ffe-42d2-9e4e-a4f37c32136f\\\\n2026-02-02T13:01:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7c44ae0c-2ffe-42d2-9e4e-a4f37c32136f to /host/opt/cni/bin/\\\\n2026-02-02T13:01:28Z [verbose] multus-daemon started\\\\n2026-02-02T13:01:28Z [verbose] Readiness Indicator file check\\\\n2026-02-02T13:02:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.506265 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.524563 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e7b999-d47e-409d-bafe-ffde8c03995e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3b10d453f884ce1095dd9bb0f8d91ebd466c4e2771a3a425abbe3001084cb09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da1bc82fcd0af4a3c1e3508fe525c95017c03f521388a28e7eaad2bb5dbb0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8797e4a71975106b9aa6c6519ef0d64047df3b21e3164f586b1441f5897e0e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.545575 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.563913 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.570672 4721 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.570723 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.570735 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.570749 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.570760 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:20Z","lastTransitionTime":"2026-02-02T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.588678 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef87877d327f3682656924644afe911470b558f
e3fd45dd708e4d6f0aa69f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:52Z\\\",\\\"message\\\":\\\"map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 13:01:52.319818 6399 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-48kgl\\\\nI0202 13:01:52.319835 6399 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0202 13:01:52.319869 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.605350 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.623750 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.640849 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.654096 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.668313 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e9bc6a3dd0746c09c22e858873a85ac0fd67fe903dfa6f2cd5061020ce5230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c7004e0fbbf58f24a8539f9e323fdba3cbe648aff6b6c7be443d765260a148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 
13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.673026 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.673089 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.673106 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.673125 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.673138 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:20Z","lastTransitionTime":"2026-02-02T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.691337 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.712617 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.725935 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.740286 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.776256 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.776316 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.776329 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.776345 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.776355 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:20Z","lastTransitionTime":"2026-02-02T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.879180 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.879570 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.879587 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.879607 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.879620 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:20Z","lastTransitionTime":"2026-02-02T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.925908 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovnkube-controller/3.log" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.926666 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovnkube-controller/2.log" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.930544 4721 generic.go:334] "Generic (PLEG): container finished" podID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerID="0ef87877d327f3682656924644afe911470b558fe3fd45dd708e4d6f0aa69f29" exitCode=1 Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.930604 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerDied","Data":"0ef87877d327f3682656924644afe911470b558fe3fd45dd708e4d6f0aa69f29"} Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.930670 4721 scope.go:117] "RemoveContainer" containerID="6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.931836 4721 scope.go:117] "RemoveContainer" containerID="0ef87877d327f3682656924644afe911470b558fe3fd45dd708e4d6f0aa69f29" Feb 02 13:02:20 crc kubenswrapper[4721]: E0202 13:02:20.932174 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.948383 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.959350 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.972084 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.982405 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.982446 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.982459 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.982477 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.982491 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:20Z","lastTransitionTime":"2026-02-02T13:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:20 crc kubenswrapper[4721]: I0202 13:02:20.991729 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.010428 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.025923 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.041015 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.056637 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e01f486c1be69ceb8e869e6feefbef4e07fd230f5cb41ec3adfaaba36430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:02:13Z\\\",\\\"message\\\":\\\"2026-02-02T13:01:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7c44ae0c-2ffe-42d2-9e4e-a4f37c32136f\\\\n2026-02-02T13:01:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7c44ae0c-2ffe-42d2-9e4e-a4f37c32136f to /host/opt/cni/bin/\\\\n2026-02-02T13:01:28Z [verbose] multus-daemon started\\\\n2026-02-02T13:01:28Z [verbose] Readiness Indicator file check\\\\n2026-02-02T13:02:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.072037 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.085133 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.085183 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:21 crc 
kubenswrapper[4721]: I0202 13:02:21.085196 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.085214 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.085227 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:21Z","lastTransitionTime":"2026-02-02T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.086720 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"con
tainerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.100167 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.131442 4721 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef87877d327f3682656924644afe911470b558fe3fd45dd708e4d6f0aa69f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6a1a04dffbd006aaa015d90a67b9f605aea78ad5e0c0dd17cff95375c2bc73f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:01:52Z\\\",\\\"message\\\":\\\"map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 13:01:52.319818 6399 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-48kgl\\\\nI0202 13:01:52.319835 6399 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0202 13:01:52.319869 6399 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef87877d327f3682656924644afe911470b558fe3fd45dd708e4d6f0aa69f29\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:02:20Z\\\",\\\"message\\\":\\\"former during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z]\\\\nI0202 13:02:20.256797 6792 services_controller.go:443] Built service openshift-marketplace/marketplace-operator-metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.53\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8383, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.53\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8081, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, 
hasNodeP\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef
0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.144666 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e7b999-d47e-409d-bafe-ffde8c03995e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3b10d453f884ce1095dd9bb0f8d91ebd466c4e2771a3a425abbe3001084cb09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da1bc82fcd0af4a3c1e3508fe525c95017c03f521388a28e7eaad2bb5dbb0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35
a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8797e4a71975106b9aa6c6519ef0d64047df3b21e3164f586b1441f5897e0e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.160757 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.172335 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.184144 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.187468 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.187541 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.187566 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.188030 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.188370 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:21Z","lastTransitionTime":"2026-02-02T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.195207 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e9bc6a3dd0746c09c22e858873a85ac0fd67fe903dfa6f2cd5061020ce5230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c7004e0fbbf58f24a8539f9e323fdba3cbe648aff6b6c7be443d765260a148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.208221 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.292393 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.292462 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.292481 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.292960 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.293014 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:21Z","lastTransitionTime":"2026-02-02T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.395468 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.395523 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.395540 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.395564 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.395581 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:21Z","lastTransitionTime":"2026-02-02T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.409026 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.409127 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.409115 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 09:03:31.163576141 +0000 UTC Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.409032 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:21 crc kubenswrapper[4721]: E0202 13:02:21.409262 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:21 crc kubenswrapper[4721]: E0202 13:02:21.409352 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:21 crc kubenswrapper[4721]: E0202 13:02:21.409437 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.498497 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.498553 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.498572 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.498601 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.498637 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:21Z","lastTransitionTime":"2026-02-02T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.601321 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.601417 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.601457 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.601488 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.601512 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:21Z","lastTransitionTime":"2026-02-02T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.704776 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.704828 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.704839 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.704856 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.704871 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:21Z","lastTransitionTime":"2026-02-02T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.808227 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.808279 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.808295 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.808318 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.808334 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:21Z","lastTransitionTime":"2026-02-02T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.911513 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.911554 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.911564 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.911580 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.911593 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:21Z","lastTransitionTime":"2026-02-02T13:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.936276 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovnkube-controller/3.log" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.940254 4721 scope.go:117] "RemoveContainer" containerID="0ef87877d327f3682656924644afe911470b558fe3fd45dd708e4d6f0aa69f29" Feb 02 13:02:21 crc kubenswrapper[4721]: E0202 13:02:21.940384 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.960340 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://adb26aa4f183a30ed2171ec82ffe1dd18441edca9ef1c82be5de119b9f5dfeac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea451bfae69b1e106c048c055efa12ca4703ed50c5a896c96249667bc894be68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.976651 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://186eb66a27288d345e91e727cc75eedd075c4b2c53f34b84f11c781a35013221\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:21 crc kubenswrapper[4721]: I0202 13:02:21.991677 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-48kgl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09b4ea41-ceb5-481a-899e-c2876ced6d49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf615344c715b3b5e29df415e722aa0a79e0f5920a112b9dc5fb2e59edb8e02a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8p6j7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-48kgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:21Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.005620 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ecfc8a9e-993f-494a-ba91-4132345cee05\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02e9bc6a3dd0746c09c22e858873a85ac0fd67fe903dfa6f2cd5061020ce5230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c7004e0fbbf58f24a8539f9e323fdba3cbe648aff6b6c7be443d765260a148\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js5c8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sz7tc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:22Z is after 2025-08-24T17:21:41Z" Feb 02 
13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.014438 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.014465 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.014473 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.014486 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.014496 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:22Z","lastTransitionTime":"2026-02-02T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.018424 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.031317 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.042311 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sgp8m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13f3ec54-a7fb-4236-9583-827d960b2086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac79b84f660c3d88abf5333666a0c58660017ec7a5a73236fdbec2a66d3581e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n42qk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sgp8m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T13:02:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.054141 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xqz79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfab3ffb-8798-423d-9b55-83868b76a14e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbwws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xqz79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.075443 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ecae132-72a0-444f-a8a0-6b66ac020b8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f90327e648588b02fa6cbf66e4669d08eca0fec9e983eba6b51b1b6255a974fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6636c906b1c4974addbe6737ada10dc7b1e7e1d26030bc353b26e6ea9cbf79c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4ad0c06550cf5333819bbc66fe88e308352e5b3731fb7589908cf0513a1779e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99a0afebb867064e3383aa667730cb4ee786878
a48cea87c6537dc4da6e782aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b1f0c247d2735f3222f768de49245a1b86e12d508d1ac4f470cef381ababf90\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6c33bc8b83864e16f12e0fdaccef704118b6bbc2dfec4dc4c6c14130988ef4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c04d33bb133ee7e08e972bd10e814b5407f058a499d339d9b022d64712d02b6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d8c39f3a470d594d6a1c9bcd759c4fee9704d429712dca6f11c37e40c9863f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.093513 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0939893-cc01-45bf-844d-77d599d4d0a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64df3044205a42c6ecd41f11a393c6bf
1c2db8e9f91a451143d99a8ffb442f06\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 13:01:03.744523 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 13:01:03.746892 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-647403325/tls.crt::/tmp/serving-cert-647403325/tls.key\\\\\\\"\\\\nI0202 13:01:19.092031 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 13:01:19.097108 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 13:01:19.097147 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 13:01:19.097179 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 13:01:19.097190 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 13:01:19.110409 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 13:01:19.110441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110446 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 13:01:19.110451 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0202 13:01:19.110454 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0202 13:01:19.110464 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 13:01:19.110471 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 13:01:19.110474 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0202 13:01:19.113447 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.110038 4721 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22919e36d401d7d15a12233a844fc73fadb81e6729ee5bf13075363ed0fb6443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.116572 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.116605 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.116616 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.116631 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.116643 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:22Z","lastTransitionTime":"2026-02-02T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.124587 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.142052 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ltw7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:02:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e01f486c1be69ceb8e869e6feefbef4e07fd230f5cb41ec3adfaaba36430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:02:13Z\\\",\\\"message\\\":\\\"2026-02-02T13:01:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7c44ae0c-2ffe-42d2-9e4e-a4f37c32136f\\\\n2026-02-02T13:01:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7c44ae0c-2ffe-42d2-9e4e-a4f37c32136f to /host/opt/cni/bin/\\\\n2026-02-02T13:01:28Z [verbose] multus-daemon started\\\\n2026-02-02T13:01:28Z [verbose] Readiness Indicator file check\\\\n2026-02-02T13:02:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:02:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dchqf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ltw7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.160434 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"04b1629d-0184-4975-8d4b-7a32913e7389\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef571a7e891a1eb9aa9c8d0c3590e57c27aed2b369226612798dfa8e9fff15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://300219467a5463b61c541d43a8a1e0a52832275c0ea9f25365ae171e7a9917d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1be4dc7e5fc10c66010fefe026d707139d3f367e23fe6e3802af466e9be1cadb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c5db5fa2480ff66ab3f6dedf8ff0657f749ae860659c7efb40c49b03e39fe8b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://beb6ee1be2b65f10751a3a75dae3c008fbb7a1a2a686c16b579f1abcce745c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2716ccd45d9db0c7f951189bed607322fc0a59641212923e2b71ffac2ba17868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d102ac76b81834338d5502d360272956cd2364b424a32a6d90a5e9bfedeeb2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2tcn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-x4lhg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.180272 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9428a242-fbf8-46a8-8527-4f85b5dc560b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27f73cad17b65a844c92ff27f1bb25dc91ee023461161b0418a23912bd186c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://374718be2e182a95c339dcbf987fdaa9a52ffc00f7c892bab39c27807b2e825a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f34214c3fbc902846b91b8e887ee4332538456bb0caf567fbb85636791af1b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.195874 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45de494a1ced27830ad6e4acb334966f48c89aab01696558e481b174c6362e4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://142fea12fc056e370e
6c152f1358ffca7f5345b290b58801d6ec8ca042262591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nm6j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-rppjz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.218879 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.218920 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.218933 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.218950 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.218963 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:22Z","lastTransitionTime":"2026-02-02T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.230007 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef87877d327f3682656924644afe911470b558fe3fd45dd708e4d6f0aa69f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef87877d327f3682656924644afe911470b558fe3fd45dd708e4d6f0aa69f29\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T13:02:20Z\\\",\\\"message\\\":\\\"former during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:20Z is after 2025-08-24T17:21:41Z]\\\\nI0202 13:02:20.256797 6792 services_controller.go:443] Built service openshift-marketplace/marketplace-operator-metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.53\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8383, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}, services.lbConfig{vips:[]string{\\\\\\\"10.217.5.53\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:8081, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodeP\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T13:02:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=ovnkube-controller pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88f6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pwcs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.244234 4721 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09e7b999-d47e-409d-bafe-ffde8c03995e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T13:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3b10d453f884ce1095dd9bb0f8d91ebd466c4e2771a3a425abbe3001084cb09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da1bc82fcd0af4a3c1e3508fe525c95017c03f521388a28e7eaad2bb5dbb0d9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8797e4a71975106b9aa6c6519ef0d64047df3b21e3164f586b1441f5897e0e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T13:01:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e8c9b4927e7df11ad15f3436346ceef587ef7e5e8716d26bd2e9b2000f9bc16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T13:01:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T13:01:01Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T13:01:00Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T13:02:22Z is after 2025-08-24T17:21:41Z" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.321445 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.321495 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.321506 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.321523 4721 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.321535 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:22Z","lastTransitionTime":"2026-02-02T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.409430 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 08:56:51.399459759 +0000 UTC Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.409646 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:22 crc kubenswrapper[4721]: E0202 13:02:22.409843 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.423633 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.423684 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.423701 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.423723 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.423741 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:22Z","lastTransitionTime":"2026-02-02T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.526905 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.526953 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.526968 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.526989 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.527005 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:22Z","lastTransitionTime":"2026-02-02T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.630459 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.630537 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.630561 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.630592 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.630612 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:22Z","lastTransitionTime":"2026-02-02T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.733843 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.733904 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.733922 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.733946 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.733961 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:22Z","lastTransitionTime":"2026-02-02T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.836841 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.836884 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.836895 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.836913 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.836926 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:22Z","lastTransitionTime":"2026-02-02T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.940842 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.940900 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.940923 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.940952 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:22 crc kubenswrapper[4721]: I0202 13:02:22.940975 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:22Z","lastTransitionTime":"2026-02-02T13:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.044426 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.044491 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.044508 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.044533 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.044553 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:23Z","lastTransitionTime":"2026-02-02T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.052251 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.052409 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:27.052378078 +0000 UTC m=+147.354892507 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.147749 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.147843 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.147865 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.147896 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.147918 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:23Z","lastTransitionTime":"2026-02-02T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.154559 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.154769 4721 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.154920 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 13:03:27.154884669 +0000 UTC m=+147.457399058 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.155005 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.155134 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.155190 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.155331 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.155399 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.155451 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.155475 4721 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.155530 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 13:03:27.155512629 +0000 UTC m=+147.458027038 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.155404 4721 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.155613 4721 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.155651 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 13:03:27.155642442 +0000 UTC m=+147.458156831 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.155347 4721 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.155690 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 13:03:27.155681714 +0000 UTC m=+147.458196103 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.251307 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.251375 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.251392 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.251418 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.251436 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:23Z","lastTransitionTime":"2026-02-02T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.354019 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.354139 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.354178 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.354208 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.354229 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:23Z","lastTransitionTime":"2026-02-02T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.409293 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.409361 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.409391 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.409462 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.409507 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:23 crc kubenswrapper[4721]: E0202 13:02:23.409590 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.409613 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 10:45:22.241663052 +0000 UTC Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.456771 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.456843 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.456863 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.456890 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.457606 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:23Z","lastTransitionTime":"2026-02-02T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.561219 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.561266 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.561286 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.561307 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.561325 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:23Z","lastTransitionTime":"2026-02-02T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.664540 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.664574 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.664582 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.664595 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.664605 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:23Z","lastTransitionTime":"2026-02-02T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.767212 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.767262 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.767273 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.767290 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.767301 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:23Z","lastTransitionTime":"2026-02-02T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.869604 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.869676 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.869688 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.869706 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.869720 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:23Z","lastTransitionTime":"2026-02-02T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.973509 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.973546 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.973556 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.973572 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:23 crc kubenswrapper[4721]: I0202 13:02:23.973583 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:23Z","lastTransitionTime":"2026-02-02T13:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.076019 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.076057 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.076086 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.076101 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.076115 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:24Z","lastTransitionTime":"2026-02-02T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.179175 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.179221 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.179235 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.179253 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.179264 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:24Z","lastTransitionTime":"2026-02-02T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.282445 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.282512 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.282535 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.282566 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.282588 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:24Z","lastTransitionTime":"2026-02-02T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.385445 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.385521 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.385546 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.385576 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.385599 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:24Z","lastTransitionTime":"2026-02-02T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.409283 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:24 crc kubenswrapper[4721]: E0202 13:02:24.409481 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.409755 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 01:09:51.679636108 +0000 UTC Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.487600 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.487647 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.487656 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.487672 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.487681 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:24Z","lastTransitionTime":"2026-02-02T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.590637 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.590703 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.590720 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.590745 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.590761 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:24Z","lastTransitionTime":"2026-02-02T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.693933 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.694007 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.694029 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.694056 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.694117 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:24Z","lastTransitionTime":"2026-02-02T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.797366 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.797428 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.797445 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.797468 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.797486 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:24Z","lastTransitionTime":"2026-02-02T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.900665 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.900713 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.900729 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.900750 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:24 crc kubenswrapper[4721]: I0202 13:02:24.900767 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:24Z","lastTransitionTime":"2026-02-02T13:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.003584 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.003663 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.003688 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.003718 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.003740 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:25Z","lastTransitionTime":"2026-02-02T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.106506 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.106583 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.106609 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.106744 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.106765 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:25Z","lastTransitionTime":"2026-02-02T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.210677 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.210738 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.210755 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.210780 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.210798 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:25Z","lastTransitionTime":"2026-02-02T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.313578 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.313649 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.313671 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.313701 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.313724 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:25Z","lastTransitionTime":"2026-02-02T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.409583 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 13:02:25 crc kubenswrapper[4721]: E0202 13:02:25.409767 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.409847 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79"
Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.409913 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 13:02:25 crc kubenswrapper[4721]: E0202 13:02:25.409988 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e"
Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.409982 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 01:06:21.215265486 +0000 UTC
Feb 02 13:02:25 crc kubenswrapper[4721]: E0202 13:02:25.410219 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
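
Each certificate_manager.go:356 entry above reports the same expiration (2026-02-24 05:53:03 +0000 UTC) but a different rotation deadline, because the deadline is re-drawn with random jitter every time it is evaluated. A minimal sketch of that behaviour, assuming a deadline drawn from roughly 70-90% of the certificate's validity window; both the band and the issuance time below are assumptions, since the log shows neither:

    package main

    import (
        "fmt"
        "math/rand"
        "time"
    )

    // rotationDeadline places the deadline at a random point between 70% and
    // 90% of the certificate's validity window, so repeated evaluations (as
    // in the log above) report different deadlines for the same expiration.
    func rotationDeadline(notBefore, notAfter time.Time) time.Time {
        total := notAfter.Sub(notBefore)
        jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
        return notBefore.Add(jittered)
    }

    func main() {
        notAfter, err := time.Parse("2006-01-02 15:04:05 -0700 MST", "2026-02-24 05:53:03 +0000 UTC")
        if err != nil {
            panic(err)
        }
        notBefore := notAfter.Add(-30 * 24 * time.Hour) // issuance time: assumed, not in the log
        for i := 0; i < 3; i++ {
            fmt.Println("rotation deadline is", rotationDeadline(notBefore, notAfter))
        }
    }
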
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.416803 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.416865 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.416889 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.416918 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.416945 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:25Z","lastTransitionTime":"2026-02-02T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.519814 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.519896 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.519936 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.519970 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.519992 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:25Z","lastTransitionTime":"2026-02-02T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.623221 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.623291 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.623313 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.623337 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.623355 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:25Z","lastTransitionTime":"2026-02-02T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.726454 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.726531 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.726552 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.726576 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.726594 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:25Z","lastTransitionTime":"2026-02-02T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.830551 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.830632 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.830652 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.830678 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.830695 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:25Z","lastTransitionTime":"2026-02-02T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.933725 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.933773 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.933792 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.933814 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:25 crc kubenswrapper[4721]: I0202 13:02:25.933831 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:25Z","lastTransitionTime":"2026-02-02T13:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.036945 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.037000 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.037018 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.037042 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.037064 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:26Z","lastTransitionTime":"2026-02-02T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.139903 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.139962 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.139980 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.140002 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.140021 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:26Z","lastTransitionTime":"2026-02-02T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.243488 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.243561 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.243575 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.243591 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.243625 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:26Z","lastTransitionTime":"2026-02-02T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.347719 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.347870 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.347908 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.347940 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.347965 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:26Z","lastTransitionTime":"2026-02-02T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.408904 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:26 crc kubenswrapper[4721]: E0202 13:02:26.409112 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.411154 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 15:30:01.517845089 +0000 UTC Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.451400 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.451438 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.451448 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.451464 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.451478 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:26Z","lastTransitionTime":"2026-02-02T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.555180 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.555233 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.555244 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.555261 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.555274 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:26Z","lastTransitionTime":"2026-02-02T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.659022 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.659143 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.659164 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.659189 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.659208 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:26Z","lastTransitionTime":"2026-02-02T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.762456 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.762527 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.762549 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.762579 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.762602 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:26Z","lastTransitionTime":"2026-02-02T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.865400 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.865463 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.865479 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.865501 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.865518 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:26Z","lastTransitionTime":"2026-02-02T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.968253 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.968292 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.968301 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.968313 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:26 crc kubenswrapper[4721]: I0202 13:02:26.968322 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:26Z","lastTransitionTime":"2026-02-02T13:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.071153 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.071226 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.071252 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.071287 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.071309 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:27Z","lastTransitionTime":"2026-02-02T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.177646 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.177719 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.177737 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.177763 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.177781 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:27Z","lastTransitionTime":"2026-02-02T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.230331 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.230405 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.230424 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.230451 4721 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.230469 4721 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T13:02:27Z","lastTransitionTime":"2026-02-02T13:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.314433 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt"]
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.314984 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.320844 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.320966 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.321400 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.321777 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.401061 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7a5f382b-2706-4d62-99b4-04ee8284bae5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.401122 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a5f382b-2706-4d62-99b4-04ee8284bae5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.401220 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7a5f382b-2706-4d62-99b4-04ee8284bae5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt"
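
The reconciler_common.go:245 entries above are verbose but regular: operation, volume name, UniqueName (plugin path), and pod. When triaging, a few lines of Go can pull those fields out of lines shaped like the ones above; the regular expression below is tailored to this log format (with its escaped inner quotes) and nothing more:

    package main

    import (
        "fmt"
        "regexp"
    )

    // Matches reconciler/operation_generator entries like the ones above,
    // where the inner quotes appear escaped (\") inside the log message.
    var volRe = regexp.MustCompile(`operationExecutor\.(\w+) started for volume \\"([^"\\]+)\\" \(UniqueName: \\"([^"\\]+)\\"\) pod \\"([^"\\]+)\\"`)

    func main() {
        // Sample line copied from the log above.
        line := `I0202 13:02:27.401061 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7a5f382b-2706-4d62-99b4-04ee8284bae5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") "`
        if m := volRe.FindStringSubmatch(line); m != nil {
            fmt.Printf("op=%s volume=%s plugin-path=%s pod=%s\n", m[1], m[2], m[3], m[4])
        }
    }
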
\"kubernetes.io/host-path/7a5f382b-2706-4d62-99b4-04ee8284bae5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.401278 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a5f382b-2706-4d62-99b4-04ee8284bae5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.401357 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a5f382b-2706-4d62-99b4-04ee8284bae5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.402127 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.402113274 podStartE2EDuration="1m8.402113274s" podCreationTimestamp="2026-02-02 13:01:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:02:27.386612134 +0000 UTC m=+87.689126533" watchObservedRunningTime="2026-02-02 13:02:27.402113274 +0000 UTC m=+87.704627663" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.409355 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:27 crc kubenswrapper[4721]: E0202 13:02:27.409464 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.409663 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:27 crc kubenswrapper[4721]: E0202 13:02:27.409748 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.409885 4721 util.go:30] "No sandbox for pod can be found. 
Feb 02 13:02:27 crc kubenswrapper[4721]: E0202 13:02:27.409947 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.411342 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 12:18:21.502504074 +0000 UTC
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.411396 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.418309 4721 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.428991 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ltw7d" podStartSLOduration=62.42896776 podStartE2EDuration="1m2.42896776s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:02:27.428928589 +0000 UTC m=+87.731442988" watchObservedRunningTime="2026-02-02 13:02:27.42896776 +0000 UTC m=+87.731482159"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.446700 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-x4lhg" podStartSLOduration=62.446679266 podStartE2EDuration="1m2.446679266s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:02:27.446540602 +0000 UTC m=+87.749055001" watchObservedRunningTime="2026-02-02 13:02:27.446679266 +0000 UTC m=+87.749193655"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.459685 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=34.459660141 podStartE2EDuration="34.459660141s" podCreationTimestamp="2026-02-02 13:01:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:02:27.459232789 +0000 UTC m=+87.761747188" watchObservedRunningTime="2026-02-02 13:02:27.459660141 +0000 UTC m=+87.762174550"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.473836 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=61.473820331 podStartE2EDuration="1m1.473820331s" podCreationTimestamp="2026-02-02 13:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:02:27.473047879 +0000 UTC m=+87.775562288" watchObservedRunningTime="2026-02-02 13:02:27.473820331 +0000 UTC m=+87.776334730"
Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.521637 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7a5f382b-2706-4d62-99b4-04ee8284bae5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt"
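
The pod_startup_latency_tracker.go:104 entries above encode simple arithmetic: podStartSLOduration is watchObservedRunningTime minus podCreationTimestamp (the pulling timestamps are zero-valued here because no image pull was needed). A quick check against the kube-apiserver-crc entry, with values copied from the log and the monotonic "m=+..." suffix dropped:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Go's default time formatting, as it appears in the log fields.
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, err := time.Parse(layout, "2026-02-02 13:01:19 +0000 UTC")
        if err != nil {
            panic(err)
        }
        running, err := time.Parse(layout, "2026-02-02 13:02:27.402113274 +0000 UTC")
        if err != nil {
            panic(err)
        }
        // Prints 1m8.402113274s, matching podStartE2EDuration and the
        // 68.402113274s SLO duration logged above.
        fmt.Println(running.Sub(created))
    }
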
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7a5f382b-2706-4d62-99b4-04ee8284bae5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.521684 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a5f382b-2706-4d62-99b4-04ee8284bae5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.521713 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7a5f382b-2706-4d62-99b4-04ee8284bae5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.521737 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a5f382b-2706-4d62-99b4-04ee8284bae5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.521759 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a5f382b-2706-4d62-99b4-04ee8284bae5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.521752 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/7a5f382b-2706-4d62-99b4-04ee8284bae5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.521809 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podStartSLOduration=62.521792464 podStartE2EDuration="1m2.521792464s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:02:27.487879358 +0000 UTC m=+87.790393767" watchObservedRunningTime="2026-02-02 13:02:27.521792464 +0000 UTC m=+87.824306853" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.521885 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/7a5f382b-2706-4d62-99b4-04ee8284bae5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" Feb 02 
13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.522646 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a5f382b-2706-4d62-99b4-04ee8284bae5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.533621 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a5f382b-2706-4d62-99b4-04ee8284bae5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.546130 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a5f382b-2706-4d62-99b4-04ee8284bae5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-g2bpt\" (UID: \"7a5f382b-2706-4d62-99b4-04ee8284bae5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.582452 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-48kgl" podStartSLOduration=62.582432934 podStartE2EDuration="1m2.582432934s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:02:27.582394963 +0000 UTC m=+87.884909362" watchObservedRunningTime="2026-02-02 13:02:27.582432934 +0000 UTC m=+87.884947323" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.596895 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sz7tc" podStartSLOduration=61.596878553 podStartE2EDuration="1m1.596878553s" podCreationTimestamp="2026-02-02 13:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:02:27.596361418 +0000 UTC m=+87.898875817" watchObservedRunningTime="2026-02-02 13:02:27.596878553 +0000 UTC m=+87.899392942" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.639535 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=64.639510667 podStartE2EDuration="1m4.639510667s" podCreationTimestamp="2026-02-02 13:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:02:27.637525639 +0000 UTC m=+87.940040028" watchObservedRunningTime="2026-02-02 13:02:27.639510667 +0000 UTC m=+87.942025076" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.644021 4721 util.go:30] "No sandbox for pod can be found. 
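[editor's note] Each pod_startup_latency_tracker.go:104 record above reports podStartE2EDuration as the gap between podCreationTimestamp and the observed running time; for etcd-crc that is 13:01:23 to 13:02:27.639510667, i.e. 1m4.639510667s, matching podStartSLOduration=64.639510667. A back-of-the-envelope check of that arithmetic (the helper is illustrative, not kubelet code):

```go
package main

import (
	"fmt"
	"time"
)

// startupE2E reproduces the arithmetic behind podStartE2EDuration in the
// records above: observed running time minus pod creation time.
func startupE2E(created, observedRunning time.Time) time.Duration {
	return observedRunning.Sub(created)
}

func main() {
	created, _ := time.Parse(time.RFC3339Nano, "2026-02-02T13:01:23Z")
	observed, _ := time.Parse(time.RFC3339Nano, "2026-02-02T13:02:27.639510667Z")
	fmt.Println(startupE2E(created, observed)) // 1m4.639510667s, as logged for etcd-crc
}
```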
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.679808 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sgp8m" podStartSLOduration=63.679787632 podStartE2EDuration="1m3.679787632s" podCreationTimestamp="2026-02-02 13:01:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:02:27.679244987 +0000 UTC m=+87.981759366" watchObservedRunningTime="2026-02-02 13:02:27.679787632 +0000 UTC m=+87.982302021" Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.965218 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" event={"ID":"7a5f382b-2706-4d62-99b4-04ee8284bae5","Type":"ContainerStarted","Data":"223e6da1547777d9020eeb86f08529f9ca9315c783ee8f377db090fd9a040e38"} Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.965680 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" event={"ID":"7a5f382b-2706-4d62-99b4-04ee8284bae5","Type":"ContainerStarted","Data":"21abe34615dab7183cfd84b4bccfa2d815de7b5e3fecd873a79ac6ca0ab37845"} Feb 02 13:02:27 crc kubenswrapper[4721]: I0202 13:02:27.990338 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g2bpt" podStartSLOduration=62.990315557 podStartE2EDuration="1m2.990315557s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:02:27.988638907 +0000 UTC m=+88.291153326" watchObservedRunningTime="2026-02-02 13:02:27.990315557 +0000 UTC m=+88.292829976" Feb 02 13:02:28 crc kubenswrapper[4721]: I0202 13:02:28.408683 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:28 crc kubenswrapper[4721]: E0202 13:02:28.408800 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:29 crc kubenswrapper[4721]: I0202 13:02:29.408683 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:29 crc kubenswrapper[4721]: I0202 13:02:29.408778 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:29 crc kubenswrapper[4721]: I0202 13:02:29.408683 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:29 crc kubenswrapper[4721]: E0202 13:02:29.408819 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:29 crc kubenswrapper[4721]: E0202 13:02:29.408922 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:29 crc kubenswrapper[4721]: E0202 13:02:29.409025 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:30 crc kubenswrapper[4721]: I0202 13:02:30.408713 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:30 crc kubenswrapper[4721]: E0202 13:02:30.409812 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:31 crc kubenswrapper[4721]: I0202 13:02:31.408549 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:31 crc kubenswrapper[4721]: I0202 13:02:31.408636 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:31 crc kubenswrapper[4721]: E0202 13:02:31.408715 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:31 crc kubenswrapper[4721]: I0202 13:02:31.408740 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:31 crc kubenswrapper[4721]: E0202 13:02:31.408841 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:31 crc kubenswrapper[4721]: E0202 13:02:31.408994 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:32 crc kubenswrapper[4721]: I0202 13:02:32.409247 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:32 crc kubenswrapper[4721]: E0202 13:02:32.409588 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:32 crc kubenswrapper[4721]: I0202 13:02:32.421302 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 02 13:02:33 crc kubenswrapper[4721]: I0202 13:02:33.408673 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:33 crc kubenswrapper[4721]: I0202 13:02:33.408711 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:33 crc kubenswrapper[4721]: E0202 13:02:33.408836 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:33 crc kubenswrapper[4721]: I0202 13:02:33.408926 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:33 crc kubenswrapper[4721]: E0202 13:02:33.409156 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:33 crc kubenswrapper[4721]: E0202 13:02:33.409232 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:34 crc kubenswrapper[4721]: I0202 13:02:34.409702 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:34 crc kubenswrapper[4721]: E0202 13:02:34.409944 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:35 crc kubenswrapper[4721]: I0202 13:02:35.409365 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:35 crc kubenswrapper[4721]: I0202 13:02:35.409447 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:35 crc kubenswrapper[4721]: I0202 13:02:35.409383 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:35 crc kubenswrapper[4721]: E0202 13:02:35.409536 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:35 crc kubenswrapper[4721]: E0202 13:02:35.409691 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:35 crc kubenswrapper[4721]: E0202 13:02:35.409813 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:36 crc kubenswrapper[4721]: I0202 13:02:36.408785 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:36 crc kubenswrapper[4721]: E0202 13:02:36.409321 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:37 crc kubenswrapper[4721]: I0202 13:02:37.409410 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:37 crc kubenswrapper[4721]: I0202 13:02:37.409448 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:37 crc kubenswrapper[4721]: E0202 13:02:37.409620 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:37 crc kubenswrapper[4721]: E0202 13:02:37.409738 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:37 crc kubenswrapper[4721]: I0202 13:02:37.410325 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:37 crc kubenswrapper[4721]: E0202 13:02:37.410480 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:37 crc kubenswrapper[4721]: I0202 13:02:37.410556 4721 scope.go:117] "RemoveContainer" containerID="0ef87877d327f3682656924644afe911470b558fe3fd45dd708e4d6f0aa69f29" Feb 02 13:02:37 crc kubenswrapper[4721]: E0202 13:02:37.410729 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" Feb 02 13:02:38 crc kubenswrapper[4721]: I0202 13:02:38.409404 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:38 crc kubenswrapper[4721]: E0202 13:02:38.409669 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:39 crc kubenswrapper[4721]: I0202 13:02:39.409252 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:39 crc kubenswrapper[4721]: I0202 13:02:39.409293 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:39 crc kubenswrapper[4721]: I0202 13:02:39.409257 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:39 crc kubenswrapper[4721]: E0202 13:02:39.409419 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:39 crc kubenswrapper[4721]: E0202 13:02:39.409548 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:39 crc kubenswrapper[4721]: E0202 13:02:39.409808 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:40 crc kubenswrapper[4721]: I0202 13:02:40.411291 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:40 crc kubenswrapper[4721]: E0202 13:02:40.411453 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:40 crc kubenswrapper[4721]: I0202 13:02:40.428789 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=8.428773328 podStartE2EDuration="8.428773328s" podCreationTimestamp="2026-02-02 13:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:02:40.428458088 +0000 UTC m=+100.730972477" watchObservedRunningTime="2026-02-02 13:02:40.428773328 +0000 UTC m=+100.731287717" Feb 02 13:02:41 crc kubenswrapper[4721]: I0202 13:02:41.408876 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:41 crc kubenswrapper[4721]: I0202 13:02:41.408931 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:41 crc kubenswrapper[4721]: E0202 13:02:41.409048 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:41 crc kubenswrapper[4721]: I0202 13:02:41.408881 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:41 crc kubenswrapper[4721]: E0202 13:02:41.409259 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:41 crc kubenswrapper[4721]: E0202 13:02:41.409274 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:42 crc kubenswrapper[4721]: I0202 13:02:42.409351 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:42 crc kubenswrapper[4721]: E0202 13:02:42.410426 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:43 crc kubenswrapper[4721]: I0202 13:02:43.408630 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:43 crc kubenswrapper[4721]: I0202 13:02:43.408639 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:43 crc kubenswrapper[4721]: I0202 13:02:43.408743 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:43 crc kubenswrapper[4721]: E0202 13:02:43.408841 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:43 crc kubenswrapper[4721]: E0202 13:02:43.408908 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:43 crc kubenswrapper[4721]: E0202 13:02:43.409150 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:44 crc kubenswrapper[4721]: I0202 13:02:44.028053 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs\") pod \"network-metrics-daemon-xqz79\" (UID: \"bfab3ffb-8798-423d-9b55-83868b76a14e\") " pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:44 crc kubenswrapper[4721]: E0202 13:02:44.028325 4721 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 13:02:44 crc kubenswrapper[4721]: E0202 13:02:44.028398 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs podName:bfab3ffb-8798-423d-9b55-83868b76a14e nodeName:}" failed. No retries permitted until 2026-02-02 13:03:48.028375415 +0000 UTC m=+168.330889844 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs") pod "network-metrics-daemon-xqz79" (UID: "bfab3ffb-8798-423d-9b55-83868b76a14e") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 13:02:44 crc kubenswrapper[4721]: I0202 13:02:44.409112 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:44 crc kubenswrapper[4721]: E0202 13:02:44.409339 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:45 crc kubenswrapper[4721]: I0202 13:02:45.409045 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:45 crc kubenswrapper[4721]: I0202 13:02:45.409234 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:45 crc kubenswrapper[4721]: E0202 13:02:45.409294 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:45 crc kubenswrapper[4721]: I0202 13:02:45.409053 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:45 crc kubenswrapper[4721]: E0202 13:02:45.409482 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:45 crc kubenswrapper[4721]: E0202 13:02:45.409707 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:46 crc kubenswrapper[4721]: I0202 13:02:46.409264 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:46 crc kubenswrapper[4721]: E0202 13:02:46.409593 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:47 crc kubenswrapper[4721]: I0202 13:02:47.409281 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:47 crc kubenswrapper[4721]: I0202 13:02:47.409334 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:47 crc kubenswrapper[4721]: E0202 13:02:47.409441 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:47 crc kubenswrapper[4721]: I0202 13:02:47.409455 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:47 crc kubenswrapper[4721]: E0202 13:02:47.409560 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:47 crc kubenswrapper[4721]: E0202 13:02:47.409743 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:48 crc kubenswrapper[4721]: I0202 13:02:48.409475 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:48 crc kubenswrapper[4721]: E0202 13:02:48.409695 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:49 crc kubenswrapper[4721]: I0202 13:02:49.409374 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:49 crc kubenswrapper[4721]: I0202 13:02:49.409428 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:49 crc kubenswrapper[4721]: I0202 13:02:49.409440 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:49 crc kubenswrapper[4721]: E0202 13:02:49.409972 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:49 crc kubenswrapper[4721]: E0202 13:02:49.409805 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:49 crc kubenswrapper[4721]: E0202 13:02:49.410083 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:50 crc kubenswrapper[4721]: I0202 13:02:50.409676 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:50 crc kubenswrapper[4721]: E0202 13:02:50.411818 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:51 crc kubenswrapper[4721]: I0202 13:02:51.409630 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:51 crc kubenswrapper[4721]: I0202 13:02:51.409703 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:51 crc kubenswrapper[4721]: I0202 13:02:51.409749 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:51 crc kubenswrapper[4721]: E0202 13:02:51.410432 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:51 crc kubenswrapper[4721]: E0202 13:02:51.410483 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:51 crc kubenswrapper[4721]: E0202 13:02:51.410513 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:51 crc kubenswrapper[4721]: I0202 13:02:51.411537 4721 scope.go:117] "RemoveContainer" containerID="0ef87877d327f3682656924644afe911470b558fe3fd45dd708e4d6f0aa69f29" Feb 02 13:02:51 crc kubenswrapper[4721]: E0202 13:02:51.411820 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pwcs2_openshift-ovn-kubernetes(b15bc48d-f88d-4b38-a9e1-00bb00b88a52)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" Feb 02 13:02:52 crc kubenswrapper[4721]: I0202 13:02:52.409062 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:52 crc kubenswrapper[4721]: E0202 13:02:52.409264 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:53 crc kubenswrapper[4721]: I0202 13:02:53.408667 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:53 crc kubenswrapper[4721]: I0202 13:02:53.408765 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:53 crc kubenswrapper[4721]: I0202 13:02:53.408765 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:53 crc kubenswrapper[4721]: E0202 13:02:53.409380 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:53 crc kubenswrapper[4721]: E0202 13:02:53.409515 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:53 crc kubenswrapper[4721]: E0202 13:02:53.409577 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:54 crc kubenswrapper[4721]: I0202 13:02:54.409723 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:54 crc kubenswrapper[4721]: E0202 13:02:54.409961 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:55 crc kubenswrapper[4721]: I0202 13:02:55.409720 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:55 crc kubenswrapper[4721]: E0202 13:02:55.409843 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:55 crc kubenswrapper[4721]: I0202 13:02:55.410037 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:55 crc kubenswrapper[4721]: E0202 13:02:55.410132 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:55 crc kubenswrapper[4721]: I0202 13:02:55.410281 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:55 crc kubenswrapper[4721]: E0202 13:02:55.410339 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:56 crc kubenswrapper[4721]: I0202 13:02:56.409202 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:56 crc kubenswrapper[4721]: E0202 13:02:56.409423 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:57 crc kubenswrapper[4721]: I0202 13:02:57.409133 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:57 crc kubenswrapper[4721]: I0202 13:02:57.409143 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:57 crc kubenswrapper[4721]: I0202 13:02:57.409170 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:57 crc kubenswrapper[4721]: E0202 13:02:57.409497 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:57 crc kubenswrapper[4721]: E0202 13:02:57.409567 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:57 crc kubenswrapper[4721]: E0202 13:02:57.409279 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:02:58 crc kubenswrapper[4721]: I0202 13:02:58.409587 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:02:58 crc kubenswrapper[4721]: E0202 13:02:58.409733 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:02:59 crc kubenswrapper[4721]: I0202 13:02:59.408877 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:02:59 crc kubenswrapper[4721]: I0202 13:02:59.408879 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:02:59 crc kubenswrapper[4721]: I0202 13:02:59.408920 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:02:59 crc kubenswrapper[4721]: E0202 13:02:59.409126 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:02:59 crc kubenswrapper[4721]: E0202 13:02:59.409256 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:02:59 crc kubenswrapper[4721]: E0202 13:02:59.409622 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:03:00 crc kubenswrapper[4721]: I0202 13:03:00.315963 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltw7d_5ba84858-caaa-4fba-8eaf-9f7ddece0b3a/kube-multus/1.log" Feb 02 13:03:00 crc kubenswrapper[4721]: I0202 13:03:00.316784 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltw7d_5ba84858-caaa-4fba-8eaf-9f7ddece0b3a/kube-multus/0.log" Feb 02 13:03:00 crc kubenswrapper[4721]: I0202 13:03:00.316867 4721 generic.go:334] "Generic (PLEG): container finished" podID="5ba84858-caaa-4fba-8eaf-9f7ddece0b3a" containerID="3e01f486c1be69ceb8e869e6feefbef4e07fd230f5cb41ec3adfaaba36430569" exitCode=1 Feb 02 13:03:00 crc kubenswrapper[4721]: I0202 13:03:00.316917 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltw7d" event={"ID":"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a","Type":"ContainerDied","Data":"3e01f486c1be69ceb8e869e6feefbef4e07fd230f5cb41ec3adfaaba36430569"} Feb 02 13:03:00 crc kubenswrapper[4721]: I0202 13:03:00.316970 4721 scope.go:117] "RemoveContainer" containerID="0b928ff615c2bf8846d26d69c3c76c248795ed9f57be2d3e90b212ea75abeab6" Feb 02 13:03:00 crc kubenswrapper[4721]: I0202 13:03:00.317587 4721 scope.go:117] "RemoveContainer" containerID="3e01f486c1be69ceb8e869e6feefbef4e07fd230f5cb41ec3adfaaba36430569" Feb 02 13:03:00 crc kubenswrapper[4721]: E0202 13:03:00.317911 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-ltw7d_openshift-multus(5ba84858-caaa-4fba-8eaf-9f7ddece0b3a)\"" pod="openshift-multus/multus-ltw7d" podUID="5ba84858-caaa-4fba-8eaf-9f7ddece0b3a" Feb 02 13:03:00 crc kubenswrapper[4721]: E0202 13:03:00.354371 4721 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 02 13:03:00 crc kubenswrapper[4721]: I0202 13:03:00.409551 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:00 crc kubenswrapper[4721]: E0202 13:03:00.410632 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:03:00 crc kubenswrapper[4721]: E0202 13:03:00.519396 4721 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 13:03:01 crc kubenswrapper[4721]: I0202 13:03:01.322645 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltw7d_5ba84858-caaa-4fba-8eaf-9f7ddece0b3a/kube-multus/1.log" Feb 02 13:03:01 crc kubenswrapper[4721]: I0202 13:03:01.409603 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:03:01 crc kubenswrapper[4721]: I0202 13:03:01.409637 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:03:01 crc kubenswrapper[4721]: E0202 13:03:01.410258 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:03:01 crc kubenswrapper[4721]: I0202 13:03:01.409644 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:03:01 crc kubenswrapper[4721]: E0202 13:03:01.410332 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:03:01 crc kubenswrapper[4721]: E0202 13:03:01.410050 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:03:02 crc kubenswrapper[4721]: I0202 13:03:02.409254 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:02 crc kubenswrapper[4721]: E0202 13:03:02.409426 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:03:03 crc kubenswrapper[4721]: I0202 13:03:03.409136 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:03:03 crc kubenswrapper[4721]: I0202 13:03:03.409170 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:03:03 crc kubenswrapper[4721]: I0202 13:03:03.409134 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:03:03 crc kubenswrapper[4721]: E0202 13:03:03.409293 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:03:03 crc kubenswrapper[4721]: E0202 13:03:03.409477 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:03:03 crc kubenswrapper[4721]: E0202 13:03:03.409611 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:03:04 crc kubenswrapper[4721]: I0202 13:03:04.409279 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:04 crc kubenswrapper[4721]: E0202 13:03:04.409429 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:03:04 crc kubenswrapper[4721]: I0202 13:03:04.410556 4721 scope.go:117] "RemoveContainer" containerID="0ef87877d327f3682656924644afe911470b558fe3fd45dd708e4d6f0aa69f29" Feb 02 13:03:05 crc kubenswrapper[4721]: I0202 13:03:05.292538 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xqz79"] Feb 02 13:03:05 crc kubenswrapper[4721]: I0202 13:03:05.292682 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:03:05 crc kubenswrapper[4721]: E0202 13:03:05.292860 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:03:05 crc kubenswrapper[4721]: I0202 13:03:05.337607 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovnkube-controller/3.log" Feb 02 13:03:05 crc kubenswrapper[4721]: I0202 13:03:05.339927 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerStarted","Data":"3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464"} Feb 02 13:03:05 crc kubenswrapper[4721]: I0202 13:03:05.340375 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:03:05 crc kubenswrapper[4721]: I0202 13:03:05.367774 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podStartSLOduration=100.367757512 podStartE2EDuration="1m40.367757512s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:05.36500301 +0000 UTC m=+125.667517409" watchObservedRunningTime="2026-02-02 13:03:05.367757512 +0000 UTC m=+125.670271901" Feb 02 13:03:05 crc kubenswrapper[4721]: I0202 13:03:05.409591 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:03:05 crc kubenswrapper[4721]: I0202 13:03:05.409658 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:03:05 crc kubenswrapper[4721]: E0202 13:03:05.409751 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:03:05 crc kubenswrapper[4721]: E0202 13:03:05.409831 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:03:05 crc kubenswrapper[4721]: E0202 13:03:05.520269 4721 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 13:03:06 crc kubenswrapper[4721]: I0202 13:03:06.409170 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:06 crc kubenswrapper[4721]: E0202 13:03:06.409440 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:03:07 crc kubenswrapper[4721]: I0202 13:03:07.409141 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:03:07 crc kubenswrapper[4721]: E0202 13:03:07.409279 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:03:07 crc kubenswrapper[4721]: I0202 13:03:07.409372 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:03:07 crc kubenswrapper[4721]: E0202 13:03:07.409431 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:03:07 crc kubenswrapper[4721]: I0202 13:03:07.409474 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:03:07 crc kubenswrapper[4721]: E0202 13:03:07.409521 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:03:08 crc kubenswrapper[4721]: I0202 13:03:08.409707 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:08 crc kubenswrapper[4721]: E0202 13:03:08.409857 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:03:09 crc kubenswrapper[4721]: I0202 13:03:09.408908 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:03:09 crc kubenswrapper[4721]: I0202 13:03:09.408972 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:03:09 crc kubenswrapper[4721]: E0202 13:03:09.409155 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:03:09 crc kubenswrapper[4721]: I0202 13:03:09.409250 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:03:09 crc kubenswrapper[4721]: E0202 13:03:09.409338 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:03:09 crc kubenswrapper[4721]: E0202 13:03:09.409532 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:03:10 crc kubenswrapper[4721]: I0202 13:03:10.409719 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:10 crc kubenswrapper[4721]: E0202 13:03:10.411468 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:03:10 crc kubenswrapper[4721]: E0202 13:03:10.521088 4721 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 13:03:11 crc kubenswrapper[4721]: I0202 13:03:11.409604 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:03:11 crc kubenswrapper[4721]: I0202 13:03:11.409620 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:03:11 crc kubenswrapper[4721]: I0202 13:03:11.409620 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:03:11 crc kubenswrapper[4721]: E0202 13:03:11.410418 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:03:11 crc kubenswrapper[4721]: E0202 13:03:11.410563 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:03:11 crc kubenswrapper[4721]: E0202 13:03:11.410703 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:03:12 crc kubenswrapper[4721]: I0202 13:03:12.409128 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:12 crc kubenswrapper[4721]: E0202 13:03:12.409682 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:03:13 crc kubenswrapper[4721]: I0202 13:03:13.409261 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:03:13 crc kubenswrapper[4721]: I0202 13:03:13.409265 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:03:13 crc kubenswrapper[4721]: I0202 13:03:13.409281 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:03:13 crc kubenswrapper[4721]: E0202 13:03:13.410412 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:03:13 crc kubenswrapper[4721]: E0202 13:03:13.410554 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:03:13 crc kubenswrapper[4721]: E0202 13:03:13.410641 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:03:14 crc kubenswrapper[4721]: I0202 13:03:14.409680 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:14 crc kubenswrapper[4721]: E0202 13:03:14.410191 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:03:15 crc kubenswrapper[4721]: I0202 13:03:15.409514 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:03:15 crc kubenswrapper[4721]: I0202 13:03:15.409545 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:03:15 crc kubenswrapper[4721]: E0202 13:03:15.409668 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:03:15 crc kubenswrapper[4721]: I0202 13:03:15.409860 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:03:15 crc kubenswrapper[4721]: E0202 13:03:15.410037 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:03:15 crc kubenswrapper[4721]: E0202 13:03:15.410331 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:03:15 crc kubenswrapper[4721]: I0202 13:03:15.410484 4721 scope.go:117] "RemoveContainer" containerID="3e01f486c1be69ceb8e869e6feefbef4e07fd230f5cb41ec3adfaaba36430569" Feb 02 13:03:15 crc kubenswrapper[4721]: E0202 13:03:15.522441 4721 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 13:03:16 crc kubenswrapper[4721]: I0202 13:03:16.386904 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltw7d_5ba84858-caaa-4fba-8eaf-9f7ddece0b3a/kube-multus/1.log" Feb 02 13:03:16 crc kubenswrapper[4721]: I0202 13:03:16.386975 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltw7d" event={"ID":"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a","Type":"ContainerStarted","Data":"c98892a7ff179bcba871f45746b3c85a83090c01da93d13aeaeb2a282472689d"} Feb 02 13:03:16 crc kubenswrapper[4721]: I0202 13:03:16.408896 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:16 crc kubenswrapper[4721]: E0202 13:03:16.409213 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:03:17 crc kubenswrapper[4721]: I0202 13:03:17.408715 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:03:17 crc kubenswrapper[4721]: I0202 13:03:17.408715 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:03:17 crc kubenswrapper[4721]: I0202 13:03:17.408715 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:03:17 crc kubenswrapper[4721]: E0202 13:03:17.410002 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:03:17 crc kubenswrapper[4721]: E0202 13:03:17.410153 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:03:17 crc kubenswrapper[4721]: E0202 13:03:17.410255 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:03:18 crc kubenswrapper[4721]: I0202 13:03:18.408970 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:18 crc kubenswrapper[4721]: E0202 13:03:18.409186 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:03:19 crc kubenswrapper[4721]: I0202 13:03:19.409140 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:03:19 crc kubenswrapper[4721]: I0202 13:03:19.409217 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:03:19 crc kubenswrapper[4721]: I0202 13:03:19.409148 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:03:19 crc kubenswrapper[4721]: E0202 13:03:19.409358 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 13:03:19 crc kubenswrapper[4721]: E0202 13:03:19.409482 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 13:03:19 crc kubenswrapper[4721]: E0202 13:03:19.409626 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xqz79" podUID="bfab3ffb-8798-423d-9b55-83868b76a14e" Feb 02 13:03:20 crc kubenswrapper[4721]: I0202 13:03:20.409610 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:20 crc kubenswrapper[4721]: E0202 13:03:20.412145 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 13:03:21 crc kubenswrapper[4721]: I0202 13:03:21.409568 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:03:21 crc kubenswrapper[4721]: I0202 13:03:21.409613 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:03:21 crc kubenswrapper[4721]: I0202 13:03:21.410547 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:03:21 crc kubenswrapper[4721]: I0202 13:03:21.412679 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 02 13:03:21 crc kubenswrapper[4721]: I0202 13:03:21.413260 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 02 13:03:21 crc kubenswrapper[4721]: I0202 13:03:21.416124 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 02 13:03:21 crc kubenswrapper[4721]: I0202 13:03:21.416174 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 02 13:03:22 crc kubenswrapper[4721]: I0202 13:03:22.408765 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:22 crc kubenswrapper[4721]: I0202 13:03:22.412416 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 02 13:03:22 crc kubenswrapper[4721]: I0202 13:03:22.412460 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 02 13:03:22 crc kubenswrapper[4721]: I0202 13:03:22.503649 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" Feb 02 13:03:27 crc kubenswrapper[4721]: I0202 13:03:27.151626 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:27 crc kubenswrapper[4721]: E0202 13:03:27.151957 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:05:29.151914502 +0000 UTC m=+269.454428921 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:27 crc kubenswrapper[4721]: I0202 13:03:27.252532 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:03:27 crc kubenswrapper[4721]: I0202 13:03:27.252584 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:03:27 crc kubenswrapper[4721]: I0202 13:03:27.252608 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:27 crc kubenswrapper[4721]: I0202 13:03:27.252625 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:27 crc kubenswrapper[4721]: I0202 13:03:27.253567 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:27 crc kubenswrapper[4721]: I0202 13:03:27.258897 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:03:27 crc kubenswrapper[4721]: I0202 13:03:27.258933 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:27 crc kubenswrapper[4721]: I0202 13:03:27.260000 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:03:27 crc kubenswrapper[4721]: I0202 13:03:27.432278 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:03:27 crc kubenswrapper[4721]: I0202 13:03:27.455269 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 13:03:27 crc kubenswrapper[4721]: I0202 13:03:27.528972 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 13:03:27 crc kubenswrapper[4721]: W0202 13:03:27.752642 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-0ef6073105612947afbeaf4f24b98a3cbcdec6b1085e2cfb473b600890e39caa WatchSource:0}: Error finding container 0ef6073105612947afbeaf4f24b98a3cbcdec6b1085e2cfb473b600890e39caa: Status 404 returned error can't find the container with id 0ef6073105612947afbeaf4f24b98a3cbcdec6b1085e2cfb473b600890e39caa Feb 02 13:03:27 crc kubenswrapper[4721]: W0202 13:03:27.908307 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-0f194111f9c52c06cbe2c2889a4f609b06af98f099a3cb6173c8e3be0cd624e5 WatchSource:0}: Error finding container 0f194111f9c52c06cbe2c2889a4f609b06af98f099a3cb6173c8e3be0cd624e5: Status 404 returned error can't find the container with id 0f194111f9c52c06cbe2c2889a4f609b06af98f099a3cb6173c8e3be0cd624e5 Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.433018 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4cf4cf9fd34c7d2ec2709b7f856ae03cc4966cf67baa38655e8de98108e5a88f"} Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.434316 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0ef6073105612947afbeaf4f24b98a3cbcdec6b1085e2cfb473b600890e39caa"} Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.434737 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"58e3c3d6723f3a79dcbbbc5d50377f655d347bc71e21de8db1163183dead9e70"} Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.434804 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1afbd55df17a654109cc7be12c213147ebbcdf7cefc4e21d97d802b8fa372095"} Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.436188 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1ce9b05059d8678d782d029736ee06c4951a9a0ae615275995003cdccfb94bcd"} Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.436242 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0f194111f9c52c06cbe2c2889a4f609b06af98f099a3cb6173c8e3be0cd624e5"} Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.436443 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.554860 4721 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeReady" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.596726 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pnfph"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.597096 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kcw66"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.597490 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.597836 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.598893 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.599674 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.617286 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.617497 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ffkjd"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.617859 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.617989 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.618146 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.618183 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.618824 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.619137 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jrfhj"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.619036 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.619156 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.619203 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.619277 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.620182 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.620346 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.620516 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.620651 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.620567 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.621219 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.621386 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.621480 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.621625 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.621236 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.621913 4721 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"kube-root-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.622655 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-2dsnx"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.623232 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.622845 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.624879 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.625250 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.628298 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.628998 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.629638 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.631526 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fqbhq"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.632029 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.634920 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.635432 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-f4l8v"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.635868 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.636303 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l49ld"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.636711 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l49ld" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.638503 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.642576 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.642863 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.643049 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.643268 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.644770 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-f4l8v" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.645090 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.645214 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.645331 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.645710 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.645834 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.645923 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.645986 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.645731 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.646043 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.646187 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.646323 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.646351 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.646147 4721 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.646383 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.646452 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.646639 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.646686 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.646742 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.646919 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.649499 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.650129 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.650244 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.650510 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.650689 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.651909 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.652155 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.652334 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.652529 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.652651 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.653849 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-zt9ng"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.654273 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.654582 4721 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pnfph"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.654664 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.654952 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-zt9ng" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.656751 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.657268 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.657630 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.658452 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.668975 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.669204 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.669317 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.669585 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.669958 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.670767 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f1e834f-23b5-42a5-9d13-b9e5720a597c-client-ca\") pod \"route-controller-manager-6576b87f9c-bg49f\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.670893 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/64fc7a32-5852-4e03-b1b7-1663f7f52b65-encryption-config\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.670985 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64fc7a32-5852-4e03-b1b7-1663f7f52b65-node-pullsecrets\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " 
pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.671093 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/64fc7a32-5852-4e03-b1b7-1663f7f52b65-etcd-serving-ca\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.671191 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/64fc7a32-5852-4e03-b1b7-1663f7f52b65-etcd-client\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.671267 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64fc7a32-5852-4e03-b1b7-1663f7f52b65-serving-cert\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.671343 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2p87\" (UniqueName: \"kubernetes.io/projected/8f1e834f-23b5-42a5-9d13-b9e5720a597c-kube-api-access-w2p87\") pod \"route-controller-manager-6576b87f9c-bg49f\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.671421 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/64fc7a32-5852-4e03-b1b7-1663f7f52b65-image-import-ca\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.671509 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw6dh\" (UniqueName: \"kubernetes.io/projected/083f0d8a-e0c4-46ae-8993-8547dd260553-kube-api-access-jw6dh\") pod \"machine-api-operator-5694c8668f-pnfph\" (UID: \"083f0d8a-e0c4-46ae-8993-8547dd260553\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.671611 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f1e834f-23b5-42a5-9d13-b9e5720a597c-serving-cert\") pod \"route-controller-manager-6576b87f9c-bg49f\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.671693 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ffkjd\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:03:28 crc 
kubenswrapper[4721]: I0202 13:03:28.671831 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64fc7a32-5852-4e03-b1b7-1663f7f52b65-config\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.672095 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6sbb\" (UniqueName: \"kubernetes.io/projected/64fc7a32-5852-4e03-b1b7-1663f7f52b65-kube-api-access-q6sbb\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.672182 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/64fc7a32-5852-4e03-b1b7-1663f7f52b65-audit\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.672263 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-config\") pod \"controller-manager-879f6c89f-ffkjd\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.672352 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-client-ca\") pod \"controller-manager-879f6c89f-ffkjd\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.672430 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083f0d8a-e0c4-46ae-8993-8547dd260553-config\") pod \"machine-api-operator-5694c8668f-pnfph\" (UID: \"083f0d8a-e0c4-46ae-8993-8547dd260553\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.672512 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64fc7a32-5852-4e03-b1b7-1663f7f52b65-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.672611 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/083f0d8a-e0c4-46ae-8993-8547dd260553-images\") pod \"machine-api-operator-5694c8668f-pnfph\" (UID: \"083f0d8a-e0c4-46ae-8993-8547dd260553\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.672690 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/083f0d8a-e0c4-46ae-8993-8547dd260553-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pnfph\" (UID: \"083f0d8a-e0c4-46ae-8993-8547dd260553\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.672766 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f1e834f-23b5-42a5-9d13-b9e5720a597c-config\") pod \"route-controller-manager-6576b87f9c-bg49f\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.672849 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nff5\" (UniqueName: \"kubernetes.io/projected/3c0670a6-888e-40e3-bf5d-82779e70dd1c-kube-api-access-4nff5\") pod \"controller-manager-879f6c89f-ffkjd\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.672928 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64fc7a32-5852-4e03-b1b7-1663f7f52b65-audit-dir\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.673055 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c0670a6-888e-40e3-bf5d-82779e70dd1c-serving-cert\") pod \"controller-manager-879f6c89f-ffkjd\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.675812 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wlhhk"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.676819 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.688314 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.688509 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.688716 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.689251 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.689371 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.689468 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.689741 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.689908 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.690449 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.695090 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.699081 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2jzch"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.699477 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zg529"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.700059 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-zg529" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.700482 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.690789 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.700523 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.693687 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.713097 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.713227 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.713251 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.713457 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.713094 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.715317 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.715501 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.716389 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-7vhgv"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.716547 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.716806 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.733141 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.733440 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.733683 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.734225 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.734409 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.735307 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.735668 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.736390 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.736645 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.736814 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.737105 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.737545 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.741081 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.742587 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.775722 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.776669 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.776801 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.777635 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.778589 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.778703 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.778787 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.778967 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.779255 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.779294 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.779261 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.779406 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.779477 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.779478 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.779418 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.779633 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.779753 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.780782 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.781943 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qr8r7"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.782348 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.782712 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.784704 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.802350 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdzwk"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.802889 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.802919 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.803482 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.803814 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qr8r7" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.803993 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdzwk" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.804510 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.806302 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zcf44"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.807620 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-csktx"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809092 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.806340 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809462 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809521 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk95h\" (UniqueName: \"kubernetes.io/projected/202d08f0-f5ea-4414-b2e6-5a690148a823-kube-api-access-zk95h\") pod \"cluster-image-registry-operator-dc59b4c8b-pvz9q\" 
(UID: \"202d08f0-f5ea-4414-b2e6-5a690148a823\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809548 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-service-ca\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809575 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809606 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/202d08f0-f5ea-4414-b2e6-5a690148a823-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pvz9q\" (UID: \"202d08f0-f5ea-4414-b2e6-5a690148a823\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809634 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f028d7b1-e474-45f8-9c4e-d1b2322175c7-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-td8sr\" (UID: \"f028d7b1-e474-45f8-9c4e-d1b2322175c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809680 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/083f0d8a-e0c4-46ae-8993-8547dd260553-images\") pod \"machine-api-operator-5694c8668f-pnfph\" (UID: \"083f0d8a-e0c4-46ae-8993-8547dd260553\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809706 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/083f0d8a-e0c4-46ae-8993-8547dd260553-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pnfph\" (UID: \"083f0d8a-e0c4-46ae-8993-8547dd260553\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809732 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f1e834f-23b5-42a5-9d13-b9e5720a597c-config\") pod \"route-controller-manager-6576b87f9c-bg49f\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809792 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61358eab-20de-46bb-9701-dc736e6eb5ff-trusted-ca\") pod \"console-operator-58897d9998-f4l8v\" (UID: 
\"61358eab-20de-46bb-9701-dc736e6eb5ff\") " pod="openshift-console-operator/console-operator-58897d9998-f4l8v" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809818 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-oauth-serving-cert\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809842 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/202d08f0-f5ea-4414-b2e6-5a690148a823-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pvz9q\" (UID: \"202d08f0-f5ea-4414-b2e6-5a690148a823\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809865 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7qwv\" (UniqueName: \"kubernetes.io/projected/70d74e61-4d44-4a6c-8a14-16e131d79e47-kube-api-access-j7qwv\") pod \"ingress-operator-5b745b69d9-z25fz\" (UID: \"70d74e61-4d44-4a6c-8a14-16e131d79e47\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809887 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/962524c6-7992-43d5-a7f3-5fdd04297f24-audit-dir\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809918 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nff5\" (UniqueName: \"kubernetes.io/projected/3c0670a6-888e-40e3-bf5d-82779e70dd1c-kube-api-access-4nff5\") pod \"controller-manager-879f6c89f-ffkjd\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809942 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64fc7a32-5852-4e03-b1b7-1663f7f52b65-audit-dir\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809963 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70d74e61-4d44-4a6c-8a14-16e131d79e47-metrics-tls\") pod \"ingress-operator-5b745b69d9-z25fz\" (UID: \"70d74e61-4d44-4a6c-8a14-16e131d79e47\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.809985 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/67f000c5-5173-44e3-89e6-446c345a6c05-machine-approver-tls\") pod \"machine-approver-56656f9798-xbv2j\" (UID: \"67f000c5-5173-44e3-89e6-446c345a6c05\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" Feb 
02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810013 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a636bbb-70b8-4b2a-96c6-94f9edba40cc-service-ca-bundle\") pod \"authentication-operator-69f744f599-jrfhj\" (UID: \"4a636bbb-70b8-4b2a-96c6-94f9edba40cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810047 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75fc13c2-ccc5-46a0-8a65-d6bc5340baab-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2rjrk\" (UID: \"75fc13c2-ccc5-46a0-8a65-d6bc5340baab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810098 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c0670a6-888e-40e3-bf5d-82779e70dd1c-serving-cert\") pod \"controller-manager-879f6c89f-ffkjd\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810128 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f1e834f-23b5-42a5-9d13-b9e5720a597c-client-ca\") pod \"route-controller-manager-6576b87f9c-bg49f\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810152 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f5kv\" (UniqueName: \"kubernetes.io/projected/a1dde568-291e-40bf-9df7-18cd5449d0aa-kube-api-access-8f5kv\") pod \"dns-operator-744455d44c-zg529\" (UID: \"a1dde568-291e-40bf-9df7-18cd5449d0aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-zg529" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810175 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-audit-policies\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810214 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/64fc7a32-5852-4e03-b1b7-1663f7f52b65-encryption-config\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810239 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70d74e61-4d44-4a6c-8a14-16e131d79e47-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z25fz\" (UID: \"70d74e61-4d44-4a6c-8a14-16e131d79e47\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810260 4721 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810283 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f028d7b1-e474-45f8-9c4e-d1b2322175c7-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-td8sr\" (UID: \"f028d7b1-e474-45f8-9c4e-d1b2322175c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810311 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64fc7a32-5852-4e03-b1b7-1663f7f52b65-node-pullsecrets\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810337 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kth9\" (UniqueName: \"kubernetes.io/projected/4a636bbb-70b8-4b2a-96c6-94f9edba40cc-kube-api-access-9kth9\") pod \"authentication-operator-69f744f599-jrfhj\" (UID: \"4a636bbb-70b8-4b2a-96c6-94f9edba40cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810360 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae3f417e-2bae-44dd-973f-5314b6f64972-console-oauth-config\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810384 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/64fc7a32-5852-4e03-b1b7-1663f7f52b65-etcd-serving-ca\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810407 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-console-config\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810432 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/64fc7a32-5852-4e03-b1b7-1663f7f52b65-etcd-client\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810453 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/64fc7a32-5852-4e03-b1b7-1663f7f52b65-serving-cert\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810482 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2p87\" (UniqueName: \"kubernetes.io/projected/8f1e834f-23b5-42a5-9d13-b9e5720a597c-kube-api-access-w2p87\") pod \"route-controller-manager-6576b87f9c-bg49f\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810504 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61358eab-20de-46bb-9701-dc736e6eb5ff-serving-cert\") pod \"console-operator-58897d9998-f4l8v\" (UID: \"61358eab-20de-46bb-9701-dc736e6eb5ff\") " pod="openshift-console-operator/console-operator-58897d9998-f4l8v" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810526 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810551 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/64fc7a32-5852-4e03-b1b7-1663f7f52b65-image-import-ca\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810573 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810596 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810626 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw876\" (UniqueName: \"kubernetes.io/projected/75fc13c2-ccc5-46a0-8a65-d6bc5340baab-kube-api-access-zw876\") pod \"openshift-apiserver-operator-796bbdcf4f-2rjrk\" (UID: \"75fc13c2-ccc5-46a0-8a65-d6bc5340baab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810652 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/67f000c5-5173-44e3-89e6-446c345a6c05-config\") pod \"machine-approver-56656f9798-xbv2j\" (UID: \"67f000c5-5173-44e3-89e6-446c345a6c05\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810673 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bbe9190-62bb-4079-afa7-adc9e970eae6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l49ld\" (UID: \"3bbe9190-62bb-4079-afa7-adc9e970eae6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l49ld" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810694 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68hn9\" (UniqueName: \"kubernetes.io/projected/61358eab-20de-46bb-9701-dc736e6eb5ff-kube-api-access-68hn9\") pod \"console-operator-58897d9998-f4l8v\" (UID: \"61358eab-20de-46bb-9701-dc736e6eb5ff\") " pod="openshift-console-operator/console-operator-58897d9998-f4l8v" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810715 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae3f417e-2bae-44dd-973f-5314b6f64972-console-serving-cert\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810740 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw6dh\" (UniqueName: \"kubernetes.io/projected/083f0d8a-e0c4-46ae-8993-8547dd260553-kube-api-access-jw6dh\") pod \"machine-api-operator-5694c8668f-pnfph\" (UID: \"083f0d8a-e0c4-46ae-8993-8547dd260553\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810764 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zht5s\" (UniqueName: \"kubernetes.io/projected/67f000c5-5173-44e3-89e6-446c345a6c05-kube-api-access-zht5s\") pod \"machine-approver-56656f9798-xbv2j\" (UID: \"67f000c5-5173-44e3-89e6-446c345a6c05\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810800 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61358eab-20de-46bb-9701-dc736e6eb5ff-config\") pod \"console-operator-58897d9998-f4l8v\" (UID: \"61358eab-20de-46bb-9701-dc736e6eb5ff\") " pod="openshift-console-operator/console-operator-58897d9998-f4l8v" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810822 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810850 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8f1e834f-23b5-42a5-9d13-b9e5720a597c-serving-cert\") pod \"route-controller-manager-6576b87f9c-bg49f\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810875 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-trusted-ca-bundle\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810902 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810929 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ffkjd\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810954 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64fc7a32-5852-4e03-b1b7-1663f7f52b65-config\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.810977 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6sbb\" (UniqueName: \"kubernetes.io/projected/64fc7a32-5852-4e03-b1b7-1663f7f52b65-kube-api-access-q6sbb\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811003 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/67f000c5-5173-44e3-89e6-446c345a6c05-auth-proxy-config\") pod \"machine-approver-56656f9798-xbv2j\" (UID: \"67f000c5-5173-44e3-89e6-446c345a6c05\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811030 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/64fc7a32-5852-4e03-b1b7-1663f7f52b65-audit\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811055 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a636bbb-70b8-4b2a-96c6-94f9edba40cc-serving-cert\") pod \"authentication-operator-69f744f599-jrfhj\" (UID: \"4a636bbb-70b8-4b2a-96c6-94f9edba40cc\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811093 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75fc13c2-ccc5-46a0-8a65-d6bc5340baab-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2rjrk\" (UID: \"75fc13c2-ccc5-46a0-8a65-d6bc5340baab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811115 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a1dde568-291e-40bf-9df7-18cd5449d0aa-metrics-tls\") pod \"dns-operator-744455d44c-zg529\" (UID: \"a1dde568-291e-40bf-9df7-18cd5449d0aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-zg529" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811139 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxdv2\" (UniqueName: \"kubernetes.io/projected/3bbe9190-62bb-4079-afa7-adc9e970eae6-kube-api-access-vxdv2\") pod \"cluster-samples-operator-665b6dd947-l49ld\" (UID: \"3bbe9190-62bb-4079-afa7-adc9e970eae6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l49ld" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811155 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-csktx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.808652 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811163 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bwbg\" (UniqueName: \"kubernetes.io/projected/f028d7b1-e474-45f8-9c4e-d1b2322175c7-kube-api-access-9bwbg\") pod \"openshift-controller-manager-operator-756b6f6bc6-td8sr\" (UID: \"f028d7b1-e474-45f8-9c4e-d1b2322175c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811464 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a636bbb-70b8-4b2a-96c6-94f9edba40cc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jrfhj\" (UID: \"4a636bbb-70b8-4b2a-96c6-94f9edba40cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811492 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70d74e61-4d44-4a6c-8a14-16e131d79e47-trusted-ca\") pod \"ingress-operator-5b745b69d9-z25fz\" (UID: \"70d74e61-4d44-4a6c-8a14-16e131d79e47\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811513 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9h96\" (UniqueName: \"kubernetes.io/projected/ae3f417e-2bae-44dd-973f-5314b6f64972-kube-api-access-w9h96\") pod \"console-f9d7485db-2dsnx\" (UID: 
\"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811535 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811565 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-config\") pod \"controller-manager-879f6c89f-ffkjd\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811595 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l4n2\" (UniqueName: \"kubernetes.io/projected/962524c6-7992-43d5-a7f3-5fdd04297f24-kube-api-access-4l4n2\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811615 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a636bbb-70b8-4b2a-96c6-94f9edba40cc-config\") pod \"authentication-operator-69f744f599-jrfhj\" (UID: \"4a636bbb-70b8-4b2a-96c6-94f9edba40cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811635 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-client-ca\") pod \"controller-manager-879f6c89f-ffkjd\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811689 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083f0d8a-e0c4-46ae-8993-8547dd260553-config\") pod \"machine-api-operator-5694c8668f-pnfph\" (UID: \"083f0d8a-e0c4-46ae-8993-8547dd260553\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811708 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64fc7a32-5852-4e03-b1b7-1663f7f52b65-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811727 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc 
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811747 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/202d08f0-f5ea-4414-b2e6-5a690148a823-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pvz9q\" (UID: \"202d08f0-f5ea-4414-b2e6-5a690148a823\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.811784 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt5vj\" (UniqueName: \"kubernetes.io/projected/af02a63f-5e62-47ff-baf5-1dc1e95dc1ad-kube-api-access-lt5vj\") pod \"downloads-7954f5f757-zt9ng\" (UID: \"af02a63f-5e62-47ff-baf5-1dc1e95dc1ad\") " pod="openshift-console/downloads-7954f5f757-zt9ng"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.812132 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/64fc7a32-5852-4e03-b1b7-1663f7f52b65-etcd-serving-ca\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.812145 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/083f0d8a-e0c4-46ae-8993-8547dd260553-images\") pod \"machine-api-operator-5694c8668f-pnfph\" (UID: \"083f0d8a-e0c4-46ae-8993-8547dd260553\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.813232 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-config\") pod \"controller-manager-879f6c89f-ffkjd\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.813595 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/64fc7a32-5852-4e03-b1b7-1663f7f52b65-image-import-ca\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.813652 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64fc7a32-5852-4e03-b1b7-1663f7f52b65-node-pullsecrets\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.815749 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f1e834f-23b5-42a5-9d13-b9e5720a597c-config\") pod \"route-controller-manager-6576b87f9c-bg49f\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.816382 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv"]
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.818789 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/083f0d8a-e0c4-46ae-8993-8547dd260553-config\") pod \"machine-api-operator-5694c8668f-pnfph\" (UID: \"083f0d8a-e0c4-46ae-8993-8547dd260553\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.808620 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.819117 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-r47km"]
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.819194 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f1e834f-23b5-42a5-9d13-b9e5720a597c-client-ca\") pod \"route-controller-manager-6576b87f9c-bg49f\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.817680 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/64fc7a32-5852-4e03-b1b7-1663f7f52b65-audit-dir\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.817662 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.819483 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg"]
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.819860 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c68d5"]
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.820107 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64fc7a32-5852-4e03-b1b7-1663f7f52b65-config\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.820259 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-ffkjd\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.820337 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp"]
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.820468 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-r47km"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.820437 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.818168 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f1e834f-23b5-42a5-9d13-b9e5720a597c-serving-cert\") pod \"route-controller-manager-6576b87f9c-bg49f\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.820545 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/64fc7a32-5852-4e03-b1b7-1663f7f52b65-audit\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.820411 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-c68d5"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.820911 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.821028 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/083f0d8a-e0c4-46ae-8993-8547dd260553-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pnfph\" (UID: \"083f0d8a-e0c4-46ae-8993-8547dd260553\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.821495 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c0670a6-888e-40e3-bf5d-82779e70dd1c-serving-cert\") pod \"controller-manager-879f6c89f-ffkjd\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.821964 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c958f"]
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.822496 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl"]
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.822507 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.822875 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h"]
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.822934 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl"
Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.823102 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.823320 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2dsnx"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.823704 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-client-ca\") pod \"controller-manager-879f6c89f-ffkjd\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.824506 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jrfhj"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.824619 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64fc7a32-5852-4e03-b1b7-1663f7f52b65-trusted-ca-bundle\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.825567 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.827100 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zt9ng"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.827807 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64fc7a32-5852-4e03-b1b7-1663f7f52b65-serving-cert\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.830639 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fqbhq"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.830809 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ffkjd"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.833828 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/64fc7a32-5852-4e03-b1b7-1663f7f52b65-etcd-client\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.833931 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kcw66"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.833977 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l49ld"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.835079 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.835764 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/64fc7a32-5852-4e03-b1b7-1663f7f52b65-encryption-config\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.839327 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-f4l8v"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.839612 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.841455 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.842875 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.844191 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.846263 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kq22p"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.847124 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.847265 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kq22p" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.848960 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wlhhk"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.851188 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-csktx"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.851804 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7gtg4"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.854315 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8t7x8"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.854555 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.854781 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8t7x8" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.856681 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kh9ph"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.857664 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-kh9ph" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.857857 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.859721 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.866632 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2jzch"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.866672 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qr8r7"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.871858 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.874325 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.874707 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zcf44"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.877639 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.881941 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.881979 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kh9ph"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.881992 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.884043 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.884210 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.892192 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zg529"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.893161 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.898148 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kq22p"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.899326 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-r47km"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.899828 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.904617 4721 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdzwk"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.906903 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c68d5"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.910209 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914276 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a636bbb-70b8-4b2a-96c6-94f9edba40cc-service-ca-bundle\") pod \"authentication-operator-69f744f599-jrfhj\" (UID: \"4a636bbb-70b8-4b2a-96c6-94f9edba40cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914317 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75fc13c2-ccc5-46a0-8a65-d6bc5340baab-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2rjrk\" (UID: \"75fc13c2-ccc5-46a0-8a65-d6bc5340baab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914348 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f5kv\" (UniqueName: \"kubernetes.io/projected/a1dde568-291e-40bf-9df7-18cd5449d0aa-kube-api-access-8f5kv\") pod \"dns-operator-744455d44c-zg529\" (UID: \"a1dde568-291e-40bf-9df7-18cd5449d0aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-zg529" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914371 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-audit-policies\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914392 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70d74e61-4d44-4a6c-8a14-16e131d79e47-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z25fz\" (UID: \"70d74e61-4d44-4a6c-8a14-16e131d79e47\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914410 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914430 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f028d7b1-e474-45f8-9c4e-d1b2322175c7-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-td8sr\" (UID: \"f028d7b1-e474-45f8-9c4e-d1b2322175c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr" Feb 02 13:03:28 crc kubenswrapper[4721]: 
I0202 13:03:28.914449 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kth9\" (UniqueName: \"kubernetes.io/projected/4a636bbb-70b8-4b2a-96c6-94f9edba40cc-kube-api-access-9kth9\") pod \"authentication-operator-69f744f599-jrfhj\" (UID: \"4a636bbb-70b8-4b2a-96c6-94f9edba40cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914469 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae3f417e-2bae-44dd-973f-5314b6f64972-console-oauth-config\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914490 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-console-config\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914520 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61358eab-20de-46bb-9701-dc736e6eb5ff-serving-cert\") pod \"console-operator-58897d9998-f4l8v\" (UID: \"61358eab-20de-46bb-9701-dc736e6eb5ff\") " pod="openshift-console-operator/console-operator-58897d9998-f4l8v" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914537 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914561 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914587 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914623 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67f000c5-5173-44e3-89e6-446c345a6c05-config\") pod \"machine-approver-56656f9798-xbv2j\" (UID: \"67f000c5-5173-44e3-89e6-446c345a6c05\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914648 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/3bbe9190-62bb-4079-afa7-adc9e970eae6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l49ld\" (UID: \"3bbe9190-62bb-4079-afa7-adc9e970eae6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l49ld" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914676 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68hn9\" (UniqueName: \"kubernetes.io/projected/61358eab-20de-46bb-9701-dc736e6eb5ff-kube-api-access-68hn9\") pod \"console-operator-58897d9998-f4l8v\" (UID: \"61358eab-20de-46bb-9701-dc736e6eb5ff\") " pod="openshift-console-operator/console-operator-58897d9998-f4l8v" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914696 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae3f417e-2bae-44dd-973f-5314b6f64972-console-serving-cert\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914715 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw876\" (UniqueName: \"kubernetes.io/projected/75fc13c2-ccc5-46a0-8a65-d6bc5340baab-kube-api-access-zw876\") pod \"openshift-apiserver-operator-796bbdcf4f-2rjrk\" (UID: \"75fc13c2-ccc5-46a0-8a65-d6bc5340baab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914740 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zht5s\" (UniqueName: \"kubernetes.io/projected/67f000c5-5173-44e3-89e6-446c345a6c05-kube-api-access-zht5s\") pod \"machine-approver-56656f9798-xbv2j\" (UID: \"67f000c5-5173-44e3-89e6-446c345a6c05\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914768 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61358eab-20de-46bb-9701-dc736e6eb5ff-config\") pod \"console-operator-58897d9998-f4l8v\" (UID: \"61358eab-20de-46bb-9701-dc736e6eb5ff\") " pod="openshift-console-operator/console-operator-58897d9998-f4l8v" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914786 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914809 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-trusted-ca-bundle\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914836 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: 
\"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914874 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/67f000c5-5173-44e3-89e6-446c345a6c05-auth-proxy-config\") pod \"machine-approver-56656f9798-xbv2j\" (UID: \"67f000c5-5173-44e3-89e6-446c345a6c05\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914903 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a636bbb-70b8-4b2a-96c6-94f9edba40cc-serving-cert\") pod \"authentication-operator-69f744f599-jrfhj\" (UID: \"4a636bbb-70b8-4b2a-96c6-94f9edba40cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914925 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75fc13c2-ccc5-46a0-8a65-d6bc5340baab-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2rjrk\" (UID: \"75fc13c2-ccc5-46a0-8a65-d6bc5340baab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914944 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a1dde568-291e-40bf-9df7-18cd5449d0aa-metrics-tls\") pod \"dns-operator-744455d44c-zg529\" (UID: \"a1dde568-291e-40bf-9df7-18cd5449d0aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-zg529" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914965 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bwbg\" (UniqueName: \"kubernetes.io/projected/f028d7b1-e474-45f8-9c4e-d1b2322175c7-kube-api-access-9bwbg\") pod \"openshift-controller-manager-operator-756b6f6bc6-td8sr\" (UID: \"f028d7b1-e474-45f8-9c4e-d1b2322175c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.914985 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a636bbb-70b8-4b2a-96c6-94f9edba40cc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jrfhj\" (UID: \"4a636bbb-70b8-4b2a-96c6-94f9edba40cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915005 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxdv2\" (UniqueName: \"kubernetes.io/projected/3bbe9190-62bb-4079-afa7-adc9e970eae6-kube-api-access-vxdv2\") pod \"cluster-samples-operator-665b6dd947-l49ld\" (UID: \"3bbe9190-62bb-4079-afa7-adc9e970eae6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l49ld" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915030 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9h96\" (UniqueName: \"kubernetes.io/projected/ae3f417e-2bae-44dd-973f-5314b6f64972-kube-api-access-w9h96\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " 
pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915049 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915084 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70d74e61-4d44-4a6c-8a14-16e131d79e47-trusted-ca\") pod \"ingress-operator-5b745b69d9-z25fz\" (UID: \"70d74e61-4d44-4a6c-8a14-16e131d79e47\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915106 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l4n2\" (UniqueName: \"kubernetes.io/projected/962524c6-7992-43d5-a7f3-5fdd04297f24-kube-api-access-4l4n2\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915123 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a636bbb-70b8-4b2a-96c6-94f9edba40cc-config\") pod \"authentication-operator-69f744f599-jrfhj\" (UID: \"4a636bbb-70b8-4b2a-96c6-94f9edba40cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915150 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/202d08f0-f5ea-4414-b2e6-5a690148a823-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pvz9q\" (UID: \"202d08f0-f5ea-4414-b2e6-5a690148a823\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915168 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5vj\" (UniqueName: \"kubernetes.io/projected/af02a63f-5e62-47ff-baf5-1dc1e95dc1ad-kube-api-access-lt5vj\") pod \"downloads-7954f5f757-zt9ng\" (UID: \"af02a63f-5e62-47ff-baf5-1dc1e95dc1ad\") " pod="openshift-console/downloads-7954f5f757-zt9ng" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915186 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915207 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915238 4721 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915257 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk95h\" (UniqueName: \"kubernetes.io/projected/202d08f0-f5ea-4414-b2e6-5a690148a823-kube-api-access-zk95h\") pod \"cluster-image-registry-operator-dc59b4c8b-pvz9q\" (UID: \"202d08f0-f5ea-4414-b2e6-5a690148a823\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915274 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-service-ca\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915295 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915314 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/202d08f0-f5ea-4414-b2e6-5a690148a823-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pvz9q\" (UID: \"202d08f0-f5ea-4414-b2e6-5a690148a823\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915329 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f028d7b1-e474-45f8-9c4e-d1b2322175c7-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-td8sr\" (UID: \"f028d7b1-e474-45f8-9c4e-d1b2322175c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915355 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-oauth-serving-cert\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915371 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/202d08f0-f5ea-4414-b2e6-5a690148a823-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pvz9q\" (UID: \"202d08f0-f5ea-4414-b2e6-5a690148a823\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915395 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-j7qwv\" (UniqueName: \"kubernetes.io/projected/70d74e61-4d44-4a6c-8a14-16e131d79e47-kube-api-access-j7qwv\") pod \"ingress-operator-5b745b69d9-z25fz\" (UID: \"70d74e61-4d44-4a6c-8a14-16e131d79e47\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915410 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61358eab-20de-46bb-9701-dc736e6eb5ff-trusted-ca\") pod \"console-operator-58897d9998-f4l8v\" (UID: \"61358eab-20de-46bb-9701-dc736e6eb5ff\") " pod="openshift-console-operator/console-operator-58897d9998-f4l8v" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915427 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/962524c6-7992-43d5-a7f3-5fdd04297f24-audit-dir\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915451 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70d74e61-4d44-4a6c-8a14-16e131d79e47-metrics-tls\") pod \"ingress-operator-5b745b69d9-z25fz\" (UID: \"70d74e61-4d44-4a6c-8a14-16e131d79e47\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.915467 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/67f000c5-5173-44e3-89e6-446c345a6c05-machine-approver-tls\") pod \"machine-approver-56656f9798-xbv2j\" (UID: \"67f000c5-5173-44e3-89e6-446c345a6c05\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.916338 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a636bbb-70b8-4b2a-96c6-94f9edba40cc-service-ca-bundle\") pod \"authentication-operator-69f744f599-jrfhj\" (UID: \"4a636bbb-70b8-4b2a-96c6-94f9edba40cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.916911 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75fc13c2-ccc5-46a0-8a65-d6bc5340baab-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2rjrk\" (UID: \"75fc13c2-ccc5-46a0-8a65-d6bc5340baab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.917831 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-audit-policies\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.918936 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c958f"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.919184 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-7gtg4"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.919880 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a636bbb-70b8-4b2a-96c6-94f9edba40cc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jrfhj\" (UID: \"4a636bbb-70b8-4b2a-96c6-94f9edba40cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.920456 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg"] Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.921609 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/202d08f0-f5ea-4414-b2e6-5a690148a823-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pvz9q\" (UID: \"202d08f0-f5ea-4414-b2e6-5a690148a823\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.922145 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a636bbb-70b8-4b2a-96c6-94f9edba40cc-config\") pod \"authentication-operator-69f744f599-jrfhj\" (UID: \"4a636bbb-70b8-4b2a-96c6-94f9edba40cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.924192 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.925359 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-service-ca\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.926427 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.928440 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.928216 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61358eab-20de-46bb-9701-dc736e6eb5ff-config\") pod \"console-operator-58897d9998-f4l8v\" (UID: \"61358eab-20de-46bb-9701-dc736e6eb5ff\") " pod="openshift-console-operator/console-operator-58897d9998-f4l8v" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.928770 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ae3f417e-2bae-44dd-973f-5314b6f64972-console-serving-cert\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.929320 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.929922 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/962524c6-7992-43d5-a7f3-5fdd04297f24-audit-dir\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.929926 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-oauth-serving-cert\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.930189 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/67f000c5-5173-44e3-89e6-446c345a6c05-auth-proxy-config\") pod \"machine-approver-56656f9798-xbv2j\" (UID: \"67f000c5-5173-44e3-89e6-446c345a6c05\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.930438 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.930447 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67f000c5-5173-44e3-89e6-446c345a6c05-config\") pod \"machine-approver-56656f9798-xbv2j\" (UID: \"67f000c5-5173-44e3-89e6-446c345a6c05\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.930456 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.930966 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61358eab-20de-46bb-9701-dc736e6eb5ff-trusted-ca\") pod \"console-operator-58897d9998-f4l8v\" (UID: \"61358eab-20de-46bb-9701-dc736e6eb5ff\") " pod="openshift-console-operator/console-operator-58897d9998-f4l8v" Feb 02 13:03:28 crc kubenswrapper[4721]: 
I0202 13:03:28.932191 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.932474 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-console-config\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.932480 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f028d7b1-e474-45f8-9c4e-d1b2322175c7-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-td8sr\" (UID: \"f028d7b1-e474-45f8-9c4e-d1b2322175c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.932626 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.933316 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.933622 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-trusted-ca-bundle\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.934455 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a636bbb-70b8-4b2a-96c6-94f9edba40cc-serving-cert\") pod \"authentication-operator-69f744f599-jrfhj\" (UID: \"4a636bbb-70b8-4b2a-96c6-94f9edba40cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.934498 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f028d7b1-e474-45f8-9c4e-d1b2322175c7-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-td8sr\" (UID: \"f028d7b1-e474-45f8-9c4e-d1b2322175c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.934583 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.934752 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bbe9190-62bb-4079-afa7-adc9e970eae6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l49ld\" (UID: \"3bbe9190-62bb-4079-afa7-adc9e970eae6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l49ld" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.934907 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.934996 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a1dde568-291e-40bf-9df7-18cd5449d0aa-metrics-tls\") pod \"dns-operator-744455d44c-zg529\" (UID: \"a1dde568-291e-40bf-9df7-18cd5449d0aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-zg529" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.936014 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/67f000c5-5173-44e3-89e6-446c345a6c05-machine-approver-tls\") pod \"machine-approver-56656f9798-xbv2j\" (UID: \"67f000c5-5173-44e3-89e6-446c345a6c05\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.936965 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.937349 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75fc13c2-ccc5-46a0-8a65-d6bc5340baab-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2rjrk\" (UID: \"75fc13c2-ccc5-46a0-8a65-d6bc5340baab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.937869 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/202d08f0-f5ea-4414-b2e6-5a690148a823-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pvz9q\" (UID: \"202d08f0-f5ea-4414-b2e6-5a690148a823\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.938672 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae3f417e-2bae-44dd-973f-5314b6f64972-console-oauth-config\") pod \"console-f9d7485db-2dsnx\" (UID: 
\"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.938772 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61358eab-20de-46bb-9701-dc736e6eb5ff-serving-cert\") pod \"console-operator-58897d9998-f4l8v\" (UID: \"61358eab-20de-46bb-9701-dc736e6eb5ff\") " pod="openshift-console-operator/console-operator-58897d9998-f4l8v" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.940558 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.960181 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.980273 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 02 13:03:28 crc kubenswrapper[4721]: I0202 13:03:28.999709 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.020370 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.039791 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.060856 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.079597 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.084671 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/70d74e61-4d44-4a6c-8a14-16e131d79e47-metrics-tls\") pod \"ingress-operator-5b745b69d9-z25fz\" (UID: \"70d74e61-4d44-4a6c-8a14-16e131d79e47\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.107449 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.113660 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70d74e61-4d44-4a6c-8a14-16e131d79e47-trusted-ca\") pod \"ingress-operator-5b745b69d9-z25fz\" (UID: \"70d74e61-4d44-4a6c-8a14-16e131d79e47\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.120325 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.140043 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.161012 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.181574 4721 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.199397 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.221149 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.240523 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.261646 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.320629 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.340731 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.360563 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.381332 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.400252 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.419769 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.440796 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.460580 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.481407 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.501144 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.519901 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.540736 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.560009 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.580441 4721 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.599684 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.619737 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.641939 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.660977 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.681627 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.701209 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.720627 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.741230 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.760930 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.782359 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.801029 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.818879 4721 request.go:700] Waited for 1.006878782s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/secrets?fieldSelector=metadata.name%3Dservice-ca-operator-dockercfg-rg9jl&limit=500&resourceVersion=0 Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.822417 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.841201 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.860919 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.882398 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.935535 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-w2p87\" (UniqueName: \"kubernetes.io/projected/8f1e834f-23b5-42a5-9d13-b9e5720a597c-kube-api-access-w2p87\") pod \"route-controller-manager-6576b87f9c-bg49f\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.946401 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6sbb\" (UniqueName: \"kubernetes.io/projected/64fc7a32-5852-4e03-b1b7-1663f7f52b65-kube-api-access-q6sbb\") pod \"apiserver-76f77b778f-kcw66\" (UID: \"64fc7a32-5852-4e03-b1b7-1663f7f52b65\") " pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.965124 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nff5\" (UniqueName: \"kubernetes.io/projected/3c0670a6-888e-40e3-bf5d-82779e70dd1c-kube-api-access-4nff5\") pod \"controller-manager-879f6c89f-ffkjd\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.980747 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw6dh\" (UniqueName: \"kubernetes.io/projected/083f0d8a-e0c4-46ae-8993-8547dd260553-kube-api-access-jw6dh\") pod \"machine-api-operator-5694c8668f-pnfph\" (UID: \"083f0d8a-e0c4-46ae-8993-8547dd260553\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph" Feb 02 13:03:29 crc kubenswrapper[4721]: I0202 13:03:29.995209 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.001014 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.020100 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.040925 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.060958 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.080195 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.100678 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.120922 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.135372 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.141453 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.147883 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.162032 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.174674 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.181514 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.198566 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.201998 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.221823 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.241508 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.261280 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.282407 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.300988 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.321287 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.341435 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.361113 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.381026 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.384955 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pnfph"] Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.395554 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f"] Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.400636 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.420166 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.440658 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.447755 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-kcw66"] Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.450268 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" event={"ID":"8f1e834f-23b5-42a5-9d13-b9e5720a597c","Type":"ContainerStarted","Data":"9dac14241b7592e3b43fe2d27aa1874f518d588eab3c2210074f031e8ca8e1b4"} Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.452834 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph" event={"ID":"083f0d8a-e0c4-46ae-8993-8547dd260553","Type":"ContainerStarted","Data":"34ef692467272d3aee99c0d5eab9b2ce3532c1d3b07684dcfeed9326b63792ca"} Feb 02 13:03:30 crc kubenswrapper[4721]: W0202 13:03:30.454190 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64fc7a32_5852_4e03_b1b7_1663f7f52b65.slice/crio-6433b629d1bde4bb980befd60ac6efea557893fc4cea82b8bd13eb528ae2b9a2 WatchSource:0}: Error finding container 6433b629d1bde4bb980befd60ac6efea557893fc4cea82b8bd13eb528ae2b9a2: Status 404 returned error can't find the container with id 6433b629d1bde4bb980befd60ac6efea557893fc4cea82b8bd13eb528ae2b9a2 Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.459356 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.465863 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ffkjd"] Feb 02 13:03:30 crc kubenswrapper[4721]: W0202 13:03:30.474436 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c0670a6_888e_40e3_bf5d_82779e70dd1c.slice/crio-167d1cdbeb24f93927ece3e3fa3df789c23a8308344e8f29012657e06e53e904 WatchSource:0}: Error finding container 167d1cdbeb24f93927ece3e3fa3df789c23a8308344e8f29012657e06e53e904: Status 404 returned error can't find the container with id 167d1cdbeb24f93927ece3e3fa3df789c23a8308344e8f29012657e06e53e904 Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.480931 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.501268 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.520352 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" 
Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.539870 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.559934 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.580639 4721 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.601485 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.620770 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.640197 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.660546 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.680885 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.700726 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.720272 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.757685 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f5kv\" (UniqueName: \"kubernetes.io/projected/a1dde568-291e-40bf-9df7-18cd5449d0aa-kube-api-access-8f5kv\") pod \"dns-operator-744455d44c-zg529\" (UID: \"a1dde568-291e-40bf-9df7-18cd5449d0aa\") " pod="openshift-dns-operator/dns-operator-744455d44c-zg529" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.776092 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70d74e61-4d44-4a6c-8a14-16e131d79e47-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z25fz\" (UID: \"70d74e61-4d44-4a6c-8a14-16e131d79e47\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.795359 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk95h\" (UniqueName: \"kubernetes.io/projected/202d08f0-f5ea-4414-b2e6-5a690148a823-kube-api-access-zk95h\") pod \"cluster-image-registry-operator-dc59b4c8b-pvz9q\" (UID: \"202d08f0-f5ea-4414-b2e6-5a690148a823\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.814195 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxdv2\" (UniqueName: \"kubernetes.io/projected/3bbe9190-62bb-4079-afa7-adc9e970eae6-kube-api-access-vxdv2\") pod \"cluster-samples-operator-665b6dd947-l49ld\" (UID: \"3bbe9190-62bb-4079-afa7-adc9e970eae6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l49ld" Feb 02 
13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.819089 4721 request.go:700] Waited for 1.899749954s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/console/token Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.834581 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9h96\" (UniqueName: \"kubernetes.io/projected/ae3f417e-2bae-44dd-973f-5314b6f64972-kube-api-access-w9h96\") pod \"console-f9d7485db-2dsnx\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") " pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.852589 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l4n2\" (UniqueName: \"kubernetes.io/projected/962524c6-7992-43d5-a7f3-5fdd04297f24-kube-api-access-4l4n2\") pod \"oauth-openshift-558db77b4-fqbhq\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.872591 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/202d08f0-f5ea-4414-b2e6-5a690148a823-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pvz9q\" (UID: \"202d08f0-f5ea-4414-b2e6-5a690148a823\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.893101 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt5vj\" (UniqueName: \"kubernetes.io/projected/af02a63f-5e62-47ff-baf5-1dc1e95dc1ad-kube-api-access-lt5vj\") pod \"downloads-7954f5f757-zt9ng\" (UID: \"af02a63f-5e62-47ff-baf5-1dc1e95dc1ad\") " pod="openshift-console/downloads-7954f5f757-zt9ng" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.913124 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.917817 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw876\" (UniqueName: \"kubernetes.io/projected/75fc13c2-ccc5-46a0-8a65-d6bc5340baab-kube-api-access-zw876\") pod \"openshift-apiserver-operator-796bbdcf4f-2rjrk\" (UID: \"75fc13c2-ccc5-46a0-8a65-d6bc5340baab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.938226 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zht5s\" (UniqueName: \"kubernetes.io/projected/67f000c5-5173-44e3-89e6-446c345a6c05-kube-api-access-zht5s\") pod \"machine-approver-56656f9798-xbv2j\" (UID: \"67f000c5-5173-44e3-89e6-446c345a6c05\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.957408 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7qwv\" (UniqueName: \"kubernetes.io/projected/70d74e61-4d44-4a6c-8a14-16e131d79e47-kube-api-access-j7qwv\") pod \"ingress-operator-5b745b69d9-z25fz\" (UID: \"70d74e61-4d44-4a6c-8a14-16e131d79e47\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.960624 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l49ld" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.978284 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bwbg\" (UniqueName: \"kubernetes.io/projected/f028d7b1-e474-45f8-9c4e-d1b2322175c7-kube-api-access-9bwbg\") pod \"openshift-controller-manager-operator-756b6f6bc6-td8sr\" (UID: \"f028d7b1-e474-45f8-9c4e-d1b2322175c7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr" Feb 02 13:03:30 crc kubenswrapper[4721]: I0202 13:03:30.984600 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:30.996813 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.004349 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-zt9ng" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.008955 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kth9\" (UniqueName: \"kubernetes.io/projected/4a636bbb-70b8-4b2a-96c6-94f9edba40cc-kube-api-access-9kth9\") pod \"authentication-operator-69f744f599-jrfhj\" (UID: \"4a636bbb-70b8-4b2a-96c6-94f9edba40cc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.016922 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68hn9\" (UniqueName: \"kubernetes.io/projected/61358eab-20de-46bb-9701-dc736e6eb5ff-kube-api-access-68hn9\") pod \"console-operator-58897d9998-f4l8v\" (UID: \"61358eab-20de-46bb-9701-dc736e6eb5ff\") " pod="openshift-console-operator/console-operator-58897d9998-f4l8v" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.026308 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-zg529" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040155 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjxt2\" (UniqueName: \"kubernetes.io/projected/b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb-kube-api-access-tjxt2\") pod \"router-default-5444994796-7vhgv\" (UID: \"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb\") " pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040234 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-etcd-client\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040266 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab4249b9-1751-45d6-be3f-58668c4542bd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040337 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2a74f12-1bed-4744-9dec-57282d5301eb-serving-cert\") pod \"openshift-config-operator-7777fb866f-fgg7h\" (UID: \"f2a74f12-1bed-4744-9dec-57282d5301eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040359 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ab4249b9-1751-45d6-be3f-58668c4542bd-audit-dir\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040408 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb-default-certificate\") pod \"router-default-5444994796-7vhgv\" (UID: \"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb\") " pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040434 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb-service-ca-bundle\") pod \"router-default-5444994796-7vhgv\" (UID: \"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb\") " pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040483 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb-stats-auth\") pod \"router-default-5444994796-7vhgv\" (UID: \"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb\") " pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040560 4721 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ab4249b9-1751-45d6-be3f-58668c4542bd-audit-policies\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040608 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-serving-cert\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040662 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5f7f80a-15ef-47b9-9e1e-325066df7897-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040685 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab4249b9-1751-45d6-be3f-58668c4542bd-serving-cert\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040734 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccp75\" (UniqueName: \"kubernetes.io/projected/ab4249b9-1751-45d6-be3f-58668c4542bd-kube-api-access-ccp75\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040803 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-registry-tls\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040833 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f2a74f12-1bed-4744-9dec-57282d5301eb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fgg7h\" (UID: \"f2a74f12-1bed-4744-9dec-57282d5301eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040857 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-etcd-ca\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040919 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040945 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5f7f80a-15ef-47b9-9e1e-325066df7897-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040970 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-config\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.040995 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5f7f80a-15ef-47b9-9e1e-325066df7897-registry-certificates\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.041017 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-etcd-service-ca\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.041048 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ab4249b9-1751-45d6-be3f-58668c4542bd-encryption-config\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.041153 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5f7f80a-15ef-47b9-9e1e-325066df7897-trusted-ca\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.041214 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlpq2\" (UniqueName: \"kubernetes.io/projected/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-kube-api-access-rlpq2\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.041238 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/ab4249b9-1751-45d6-be3f-58668c4542bd-etcd-client\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.041261 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-bound-sa-token\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.041284 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwm4n\" (UniqueName: \"kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-kube-api-access-rwm4n\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.041312 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ab4249b9-1751-45d6-be3f-58668c4542bd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.041346 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb-metrics-certs\") pod \"router-default-5444994796-7vhgv\" (UID: \"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb\") " pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.041381 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds87p\" (UniqueName: \"kubernetes.io/projected/f2a74f12-1bed-4744-9dec-57282d5301eb-kube-api-access-ds87p\") pod \"openshift-config-operator-7777fb866f-fgg7h\" (UID: \"f2a74f12-1bed-4744-9dec-57282d5301eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" Feb 02 13:03:31 crc kubenswrapper[4721]: E0202 13:03:31.044217 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:31.544202177 +0000 UTC m=+151.846716566 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.059228 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.082125 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.112183 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.126548 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.126729 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fqbhq"] Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.131659 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.142177 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.142474 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/46f85b66-5a30-4bef-909c-26750b18e72d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdzwk\" (UID: \"46f85b66-5a30-4bef-909c-26750b18e72d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdzwk" Feb 02 13:03:31 crc kubenswrapper[4721]: E0202 13:03:31.142545 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:31.642523692 +0000 UTC m=+151.945038161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.142577 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-mountpoint-dir\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.142628 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/911d6570-6a65-42b8-a562-3e1ccdc8d562-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c958f\" (UID: \"911d6570-6a65-42b8-a562-3e1ccdc8d562\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.142648 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-socket-dir\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.142674 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/63b0f1ec-c2d9-4005-ba10-839949dbbcac-node-bootstrap-token\") pod \"machine-config-server-8t7x8\" (UID: \"63b0f1ec-c2d9-4005-ba10-839949dbbcac\") " pod="openshift-machine-config-operator/machine-config-server-8t7x8" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.142695 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w84l4\" (UniqueName: \"kubernetes.io/projected/ab926544-a708-445a-aaf9-0e3ad4593676-kube-api-access-w84l4\") pod \"service-ca-operator-777779d784-csktx\" (UID: \"ab926544-a708-445a-aaf9-0e3ad4593676\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csktx" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.142753 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ab4249b9-1751-45d6-be3f-58668c4542bd-encryption-config\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.142788 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5f7f80a-15ef-47b9-9e1e-325066df7897-trusted-ca\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.142805 4721 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ab4249b9-1751-45d6-be3f-58668c4542bd-etcd-client\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.142823 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4sll\" (UniqueName: \"kubernetes.io/projected/63b0f1ec-c2d9-4005-ba10-839949dbbcac-kube-api-access-n4sll\") pod \"machine-config-server-8t7x8\" (UID: \"63b0f1ec-c2d9-4005-ba10-839949dbbcac\") " pod="openshift-machine-config-operator/machine-config-server-8t7x8" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.142840 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7grx\" (UniqueName: \"kubernetes.io/projected/6104d27e-fefa-4e2a-9b9e-62013c96f664-kube-api-access-m7grx\") pod \"multus-admission-controller-857f4d67dd-c68d5\" (UID: \"6104d27e-fefa-4e2a-9b9e-62013c96f664\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c68d5" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.142869 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-bound-sa-token\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.142897 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2de0eb97-f51f-4468-9a68-eb9d6a7ce40d-proxy-tls\") pod \"machine-config-controller-84d6567774-f5gbr\" (UID: \"2de0eb97-f51f-4468-9a68-eb9d6a7ce40d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.142926 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/625494cc-c7b0-4a0a-811c-d4822b1c0acc-config-volume\") pod \"dns-default-kh9ph\" (UID: \"625494cc-c7b0-4a0a-811c-d4822b1c0acc\") " pod="openshift-dns/dns-default-kh9ph" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.143719 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a98572f-0fab-4dab-9935-6bf52cdc7fff-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wdm7\" (UID: \"0a98572f-0fab-4dab-9935-6bf52cdc7fff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.144296 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab926544-a708-445a-aaf9-0e3ad4593676-serving-cert\") pod \"service-ca-operator-777779d784-csktx\" (UID: \"ab926544-a708-445a-aaf9-0e3ad4593676\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csktx" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.144327 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/ab4249b9-1751-45d6-be3f-58668c4542bd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.144354 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0a1d56e-00d0-4e88-bdfb-461578e327e6-config\") pod \"kube-controller-manager-operator-78b949d7b-w9gmp\" (UID: \"a0a1d56e-00d0-4e88-bdfb-461578e327e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.144393 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53727b9f-9f5f-4f6e-8fa2-a6018c8225f5-cert\") pod \"ingress-canary-kq22p\" (UID: \"53727b9f-9f5f-4f6e-8fa2-a6018c8225f5\") " pod="openshift-ingress-canary/ingress-canary-kq22p" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.144417 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6104d27e-fefa-4e2a-9b9e-62013c96f664-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c68d5\" (UID: \"6104d27e-fefa-4e2a-9b9e-62013c96f664\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c68d5" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.144444 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds87p\" (UniqueName: \"kubernetes.io/projected/f2a74f12-1bed-4744-9dec-57282d5301eb-kube-api-access-ds87p\") pod \"openshift-config-operator-7777fb866f-fgg7h\" (UID: \"f2a74f12-1bed-4744-9dec-57282d5301eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.144471 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-plugins-dir\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.144529 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-registration-dir\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.144559 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2de0eb97-f51f-4468-9a68-eb9d6a7ce40d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-f5gbr\" (UID: \"2de0eb97-f51f-4468-9a68-eb9d6a7ce40d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.144587 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0a1d56e-00d0-4e88-bdfb-461578e327e6-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-w9gmp\" (UID: \"a0a1d56e-00d0-4e88-bdfb-461578e327e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.144627 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-config-volume\") pod \"collect-profiles-29500620-rxhcg\" (UID: \"873a8c0c-9da4-4619-9ebf-7a327eb22b7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.144654 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wj4v\" (UniqueName: \"kubernetes.io/projected/46f85b66-5a30-4bef-909c-26750b18e72d-kube-api-access-5wj4v\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdzwk\" (UID: \"46f85b66-5a30-4bef-909c-26750b18e72d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdzwk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.144680 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vzjw\" (UniqueName: \"kubernetes.io/projected/911d6570-6a65-42b8-a562-3e1ccdc8d562-kube-api-access-9vzjw\") pod \"machine-config-operator-74547568cd-c958f\" (UID: \"911d6570-6a65-42b8-a562-3e1ccdc8d562\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.144703 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrsnv\" (UniqueName: \"kubernetes.io/projected/c5b5487e-8a60-4967-b0f3-1d983c559f8a-kube-api-access-zrsnv\") pod \"packageserver-d55dfcdfc-krxdl\" (UID: \"c5b5487e-8a60-4967-b0f3-1d983c559f8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.144756 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab4249b9-1751-45d6-be3f-58668c4542bd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145197 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ab4249b9-1751-45d6-be3f-58668c4542bd-audit-dir\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145236 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjb78\" (UniqueName: \"kubernetes.io/projected/cb2244ea-f203-4f66-9a4d-aad5e58a5c46-kube-api-access-gjb78\") pod \"catalog-operator-68c6474976-bz9nm\" (UID: \"cb2244ea-f203-4f66-9a4d-aad5e58a5c46\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145262 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/32b31753-6a52-4364-b01f-9d50aeac7c13-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-78nss\" (UID: \"32b31753-6a52-4364-b01f-9d50aeac7c13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145289 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb-default-certificate\") pod \"router-default-5444994796-7vhgv\" (UID: \"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb\") " pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145312 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb-service-ca-bundle\") pod \"router-default-5444994796-7vhgv\" (UID: \"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb\") " pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145336 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb-stats-auth\") pod \"router-default-5444994796-7vhgv\" (UID: \"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb\") " pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145359 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab926544-a708-445a-aaf9-0e3ad4593676-config\") pod \"service-ca-operator-777779d784-csktx\" (UID: \"ab926544-a708-445a-aaf9-0e3ad4593676\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csktx" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145382 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ab4249b9-1751-45d6-be3f-58668c4542bd-audit-policies\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145403 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnzzr\" (UniqueName: \"kubernetes.io/projected/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-kube-api-access-lnzzr\") pod \"collect-profiles-29500620-rxhcg\" (UID: \"873a8c0c-9da4-4619-9ebf-7a327eb22b7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145426 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hvpn\" (UniqueName: \"kubernetes.io/projected/53727b9f-9f5f-4f6e-8fa2-a6018c8225f5-kube-api-access-6hvpn\") pod \"ingress-canary-kq22p\" (UID: \"53727b9f-9f5f-4f6e-8fa2-a6018c8225f5\") " pod="openshift-ingress-canary/ingress-canary-kq22p" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145451 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cb2244ea-f203-4f66-9a4d-aad5e58a5c46-profile-collector-cert\") pod \"catalog-operator-68c6474976-bz9nm\" (UID: 
\"cb2244ea-f203-4f66-9a4d-aad5e58a5c46\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145475 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-serving-cert\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145498 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/debfaa13-5820-4570-a447-8ef48903144c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2wzzt\" (UID: \"debfaa13-5820-4570-a447-8ef48903144c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145523 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5f7f80a-15ef-47b9-9e1e-325066df7897-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145562 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32b31753-6a52-4364-b01f-9d50aeac7c13-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-78nss\" (UID: \"32b31753-6a52-4364-b01f-9d50aeac7c13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145591 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-registry-tls\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145615 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-secret-volume\") pod \"collect-profiles-29500620-rxhcg\" (UID: \"873a8c0c-9da4-4619-9ebf-7a327eb22b7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145638 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn92f\" (UniqueName: \"kubernetes.io/projected/2c9074bc-889d-4ce7-a250-6fc5984703e0-kube-api-access-xn92f\") pod \"marketplace-operator-79b997595-zcf44\" (UID: \"2c9074bc-889d-4ce7-a250-6fc5984703e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145666 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b06f9eba-0e3d-47fb-a386-a166987e78fd-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hlnsv\" (UID: 
\"b06f9eba-0e3d-47fb-a386-a166987e78fd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145689 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0a1d56e-00d0-4e88-bdfb-461578e327e6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-w9gmp\" (UID: \"a0a1d56e-00d0-4e88-bdfb-461578e327e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145715 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b31753-6a52-4364-b01f-9d50aeac7c13-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-78nss\" (UID: \"32b31753-6a52-4364-b01f-9d50aeac7c13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145753 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a98572f-0fab-4dab-9935-6bf52cdc7fff-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wdm7\" (UID: \"0a98572f-0fab-4dab-9935-6bf52cdc7fff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145777 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zflp4\" (UniqueName: \"kubernetes.io/projected/625494cc-c7b0-4a0a-811c-d4822b1c0acc-kube-api-access-zflp4\") pod \"dns-default-kh9ph\" (UID: \"625494cc-c7b0-4a0a-811c-d4822b1c0acc\") " pod="openshift-dns/dns-default-kh9ph" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145802 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5f7f80a-15ef-47b9-9e1e-325066df7897-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145826 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2qsg\" (UniqueName: \"kubernetes.io/projected/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-kube-api-access-p2qsg\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145865 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-config\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145940 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5f7f80a-15ef-47b9-9e1e-325066df7897-registry-certificates\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145965 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5b5487e-8a60-4967-b0f3-1d983c559f8a-webhook-cert\") pod \"packageserver-d55dfcdfc-krxdl\" (UID: \"c5b5487e-8a60-4967-b0f3-1d983c559f8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.145988 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/911d6570-6a65-42b8-a562-3e1ccdc8d562-proxy-tls\") pod \"machine-config-operator-74547568cd-c958f\" (UID: \"911d6570-6a65-42b8-a562-3e1ccdc8d562\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.146010 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc-config\") pod \"kube-apiserver-operator-766d6c64bb-mkplf\" (UID: \"c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.146051 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-etcd-service-ca\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.146091 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b06f9eba-0e3d-47fb-a386-a166987e78fd-srv-cert\") pod \"olm-operator-6b444d44fb-hlnsv\" (UID: \"b06f9eba-0e3d-47fb-a386-a166987e78fd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.146118 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlpq2\" (UniqueName: \"kubernetes.io/projected/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-kube-api-access-rlpq2\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.146158 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/37b1d658-0b12-4afd-9b42-9f54f553d432-signing-cabundle\") pod \"service-ca-9c57cc56f-r47km\" (UID: \"37b1d658-0b12-4afd-9b42-9f54f553d432\") " pod="openshift-service-ca/service-ca-9c57cc56f-r47km" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.146201 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwm4n\" (UniqueName: \"kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-kube-api-access-rwm4n\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.146229 4721 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mkplf\" (UID: \"c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.146255 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkxj7\" (UniqueName: \"kubernetes.io/projected/bace5e2a-2c1a-433c-bc00-9121a45aa515-kube-api-access-jkxj7\") pod \"migrator-59844c95c7-qr8r7\" (UID: \"bace5e2a-2c1a-433c-bc00-9121a45aa515\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qr8r7" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.146316 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lbkx\" (UniqueName: \"kubernetes.io/projected/2de0eb97-f51f-4468-9a68-eb9d6a7ce40d-kube-api-access-4lbkx\") pod \"machine-config-controller-84d6567774-f5gbr\" (UID: \"2de0eb97-f51f-4468-9a68-eb9d6a7ce40d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.146422 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5f7f80a-15ef-47b9-9e1e-325066df7897-trusted-ca\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.147031 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/63b0f1ec-c2d9-4005-ba10-839949dbbcac-certs\") pod \"machine-config-server-8t7x8\" (UID: \"63b0f1ec-c2d9-4005-ba10-839949dbbcac\") " pod="openshift-machine-config-operator/machine-config-server-8t7x8" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.147166 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c5b5487e-8a60-4967-b0f3-1d983c559f8a-tmpfs\") pod \"packageserver-d55dfcdfc-krxdl\" (UID: \"c5b5487e-8a60-4967-b0f3-1d983c559f8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.147189 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5x7n\" (UniqueName: \"kubernetes.io/projected/37b1d658-0b12-4afd-9b42-9f54f553d432-kube-api-access-s5x7n\") pod \"service-ca-9c57cc56f-r47km\" (UID: \"37b1d658-0b12-4afd-9b42-9f54f553d432\") " pod="openshift-service-ca/service-ca-9c57cc56f-r47km" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.147212 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb-metrics-certs\") pod \"router-default-5444994796-7vhgv\" (UID: \"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb\") " pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.147230 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/c5b5487e-8a60-4967-b0f3-1d983c559f8a-apiservice-cert\") pod \"packageserver-d55dfcdfc-krxdl\" (UID: \"c5b5487e-8a60-4967-b0f3-1d983c559f8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.147305 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bksrl\" (UniqueName: \"kubernetes.io/projected/debfaa13-5820-4570-a447-8ef48903144c-kube-api-access-bksrl\") pod \"package-server-manager-789f6589d5-2wzzt\" (UID: \"debfaa13-5820-4570-a447-8ef48903144c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.147324 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/625494cc-c7b0-4a0a-811c-d4822b1c0acc-metrics-tls\") pod \"dns-default-kh9ph\" (UID: \"625494cc-c7b0-4a0a-811c-d4822b1c0acc\") " pod="openshift-dns/dns-default-kh9ph" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.147367 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjxt2\" (UniqueName: \"kubernetes.io/projected/b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb-kube-api-access-tjxt2\") pod \"router-default-5444994796-7vhgv\" (UID: \"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb\") " pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.147587 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5f7f80a-15ef-47b9-9e1e-325066df7897-ca-trust-extracted\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.147823 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-etcd-client\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.147855 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkvfb\" (UniqueName: \"kubernetes.io/projected/0a98572f-0fab-4dab-9935-6bf52cdc7fff-kube-api-access-hkvfb\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wdm7\" (UID: \"0a98572f-0fab-4dab-9935-6bf52cdc7fff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.154306 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/911d6570-6a65-42b8-a562-3e1ccdc8d562-images\") pod \"machine-config-operator-74547568cd-c958f\" (UID: \"911d6570-6a65-42b8-a562-3e1ccdc8d562\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.154337 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/2c9074bc-889d-4ce7-a250-6fc5984703e0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zcf44\" (UID: \"2c9074bc-889d-4ce7-a250-6fc5984703e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.154369 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/37b1d658-0b12-4afd-9b42-9f54f553d432-signing-key\") pod \"service-ca-9c57cc56f-r47km\" (UID: \"37b1d658-0b12-4afd-9b42-9f54f553d432\") " pod="openshift-service-ca/service-ca-9c57cc56f-r47km" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.154414 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2a74f12-1bed-4744-9dec-57282d5301eb-serving-cert\") pod \"openshift-config-operator-7777fb866f-fgg7h\" (UID: \"f2a74f12-1bed-4744-9dec-57282d5301eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.154445 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsw9p\" (UniqueName: \"kubernetes.io/projected/b06f9eba-0e3d-47fb-a386-a166987e78fd-kube-api-access-hsw9p\") pod \"olm-operator-6b444d44fb-hlnsv\" (UID: \"b06f9eba-0e3d-47fb-a386-a166987e78fd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.154488 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccp75\" (UniqueName: \"kubernetes.io/projected/ab4249b9-1751-45d6-be3f-58668c4542bd-kube-api-access-ccp75\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.154509 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab4249b9-1751-45d6-be3f-58668c4542bd-serving-cert\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.154538 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cb2244ea-f203-4f66-9a4d-aad5e58a5c46-srv-cert\") pod \"catalog-operator-68c6474976-bz9nm\" (UID: \"cb2244ea-f203-4f66-9a4d-aad5e58a5c46\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.154565 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f2a74f12-1bed-4744-9dec-57282d5301eb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fgg7h\" (UID: \"f2a74f12-1bed-4744-9dec-57282d5301eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.154725 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb-stats-auth\") pod \"router-default-5444994796-7vhgv\" (UID: \"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb\") " 
pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.149423 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ab4249b9-1751-45d6-be3f-58668c4542bd-audit-policies\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.148641 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab4249b9-1751-45d6-be3f-58668c4542bd-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.149833 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-etcd-service-ca\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.148978 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ab4249b9-1751-45d6-be3f-58668c4542bd-audit-dir\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.148762 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb-service-ca-bundle\") pod \"router-default-5444994796-7vhgv\" (UID: \"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb\") " pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.150256 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ab4249b9-1751-45d6-be3f-58668c4542bd-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.148779 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-config\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.149776 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5f7f80a-15ef-47b9-9e1e-325066df7897-registry-certificates\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.152386 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb-metrics-certs\") pod \"router-default-5444994796-7vhgv\" (UID: \"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb\") 
" pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.154059 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ab4249b9-1751-45d6-be3f-58668c4542bd-etcd-client\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.155974 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/f2a74f12-1bed-4744-9dec-57282d5301eb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fgg7h\" (UID: \"f2a74f12-1bed-4744-9dec-57282d5301eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.157327 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-serving-cert\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.158237 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab4249b9-1751-45d6-be3f-58668c4542bd-serving-cert\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.158446 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-etcd-client\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.159305 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2c9074bc-889d-4ce7-a250-6fc5984703e0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zcf44\" (UID: \"2c9074bc-889d-4ce7-a250-6fc5984703e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.159363 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-etcd-ca\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.159385 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mkplf\" (UID: \"c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.159407 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-csi-data-dir\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.159588 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ab4249b9-1751-45d6-be3f-58668c4542bd-encryption-config\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.160276 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-etcd-ca\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.160419 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb-default-certificate\") pod \"router-default-5444994796-7vhgv\" (UID: \"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb\") " pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.160909 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-registry-tls\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.160920 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5f7f80a-15ef-47b9-9e1e-325066df7897-installation-pull-secrets\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.163708 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2a74f12-1bed-4744-9dec-57282d5301eb-serving-cert\") pod \"openshift-config-operator-7777fb866f-fgg7h\" (UID: \"f2a74f12-1bed-4744-9dec-57282d5301eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.218408 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-bound-sa-token\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.224457 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwm4n\" (UniqueName: \"kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-kube-api-access-rwm4n\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.237619 4721 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rlpq2\" (UniqueName: \"kubernetes.io/projected/4a107c9e-3f0e-4d4b-8b02-65bda6793a2d-kube-api-access-rlpq2\") pod \"etcd-operator-b45778765-2jzch\" (UID: \"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.271542 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds87p\" (UniqueName: \"kubernetes.io/projected/f2a74f12-1bed-4744-9dec-57282d5301eb-kube-api-access-ds87p\") pod \"openshift-config-operator-7777fb866f-fgg7h\" (UID: \"f2a74f12-1bed-4744-9dec-57282d5301eb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.271885 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-f4l8v" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.272940 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkvfb\" (UniqueName: \"kubernetes.io/projected/0a98572f-0fab-4dab-9935-6bf52cdc7fff-kube-api-access-hkvfb\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wdm7\" (UID: \"0a98572f-0fab-4dab-9935-6bf52cdc7fff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.272969 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/911d6570-6a65-42b8-a562-3e1ccdc8d562-images\") pod \"machine-config-operator-74547568cd-c958f\" (UID: \"911d6570-6a65-42b8-a562-3e1ccdc8d562\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.273006 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c9074bc-889d-4ce7-a250-6fc5984703e0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zcf44\" (UID: \"2c9074bc-889d-4ce7-a250-6fc5984703e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.273025 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/37b1d658-0b12-4afd-9b42-9f54f553d432-signing-key\") pod \"service-ca-9c57cc56f-r47km\" (UID: \"37b1d658-0b12-4afd-9b42-9f54f553d432\") " pod="openshift-service-ca/service-ca-9c57cc56f-r47km" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.273044 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsw9p\" (UniqueName: \"kubernetes.io/projected/b06f9eba-0e3d-47fb-a386-a166987e78fd-kube-api-access-hsw9p\") pod \"olm-operator-6b444d44fb-hlnsv\" (UID: \"b06f9eba-0e3d-47fb-a386-a166987e78fd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.273122 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cb2244ea-f203-4f66-9a4d-aad5e58a5c46-srv-cert\") pod \"catalog-operator-68c6474976-bz9nm\" (UID: \"cb2244ea-f203-4f66-9a4d-aad5e58a5c46\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" Feb 02 13:03:31 
crc kubenswrapper[4721]: I0202 13:03:31.273141 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2c9074bc-889d-4ce7-a250-6fc5984703e0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zcf44\" (UID: \"2c9074bc-889d-4ce7-a250-6fc5984703e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.273162 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mkplf\" (UID: \"c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.273200 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-csi-data-dir\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.273223 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.273243 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/46f85b66-5a30-4bef-909c-26750b18e72d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdzwk\" (UID: \"46f85b66-5a30-4bef-909c-26750b18e72d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdzwk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.273294 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-mountpoint-dir\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.273315 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/911d6570-6a65-42b8-a562-3e1ccdc8d562-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c958f\" (UID: \"911d6570-6a65-42b8-a562-3e1ccdc8d562\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.273849 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-socket-dir\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.273880 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/63b0f1ec-c2d9-4005-ba10-839949dbbcac-node-bootstrap-token\") pod \"machine-config-server-8t7x8\" (UID: \"63b0f1ec-c2d9-4005-ba10-839949dbbcac\") " pod="openshift-machine-config-operator/machine-config-server-8t7x8" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.273927 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w84l4\" (UniqueName: \"kubernetes.io/projected/ab926544-a708-445a-aaf9-0e3ad4593676-kube-api-access-w84l4\") pod \"service-ca-operator-777779d784-csktx\" (UID: \"ab926544-a708-445a-aaf9-0e3ad4593676\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csktx" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.273959 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4sll\" (UniqueName: \"kubernetes.io/projected/63b0f1ec-c2d9-4005-ba10-839949dbbcac-kube-api-access-n4sll\") pod \"machine-config-server-8t7x8\" (UID: \"63b0f1ec-c2d9-4005-ba10-839949dbbcac\") " pod="openshift-machine-config-operator/machine-config-server-8t7x8" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274006 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7grx\" (UniqueName: \"kubernetes.io/projected/6104d27e-fefa-4e2a-9b9e-62013c96f664-kube-api-access-m7grx\") pod \"multus-admission-controller-857f4d67dd-c68d5\" (UID: \"6104d27e-fefa-4e2a-9b9e-62013c96f664\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c68d5" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274033 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-csi-data-dir\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274035 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2de0eb97-f51f-4468-9a68-eb9d6a7ce40d-proxy-tls\") pod \"machine-config-controller-84d6567774-f5gbr\" (UID: \"2de0eb97-f51f-4468-9a68-eb9d6a7ce40d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274119 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a98572f-0fab-4dab-9935-6bf52cdc7fff-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wdm7\" (UID: \"0a98572f-0fab-4dab-9935-6bf52cdc7fff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274140 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/625494cc-c7b0-4a0a-811c-d4822b1c0acc-config-volume\") pod \"dns-default-kh9ph\" (UID: \"625494cc-c7b0-4a0a-811c-d4822b1c0acc\") " pod="openshift-dns/dns-default-kh9ph" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274182 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab926544-a708-445a-aaf9-0e3ad4593676-serving-cert\") pod \"service-ca-operator-777779d784-csktx\" (UID: 
\"ab926544-a708-445a-aaf9-0e3ad4593676\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csktx" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274200 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53727b9f-9f5f-4f6e-8fa2-a6018c8225f5-cert\") pod \"ingress-canary-kq22p\" (UID: \"53727b9f-9f5f-4f6e-8fa2-a6018c8225f5\") " pod="openshift-ingress-canary/ingress-canary-kq22p" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274217 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0a1d56e-00d0-4e88-bdfb-461578e327e6-config\") pod \"kube-controller-manager-operator-78b949d7b-w9gmp\" (UID: \"a0a1d56e-00d0-4e88-bdfb-461578e327e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274272 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6104d27e-fefa-4e2a-9b9e-62013c96f664-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c68d5\" (UID: \"6104d27e-fefa-4e2a-9b9e-62013c96f664\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c68d5" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274288 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-plugins-dir\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274308 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-registration-dir\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274348 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0a1d56e-00d0-4e88-bdfb-461578e327e6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-w9gmp\" (UID: \"a0a1d56e-00d0-4e88-bdfb-461578e327e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274370 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2de0eb97-f51f-4468-9a68-eb9d6a7ce40d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-f5gbr\" (UID: \"2de0eb97-f51f-4468-9a68-eb9d6a7ce40d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274390 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-config-volume\") pod \"collect-profiles-29500620-rxhcg\" (UID: \"873a8c0c-9da4-4619-9ebf-7a327eb22b7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274428 4721 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vzjw\" (UniqueName: \"kubernetes.io/projected/911d6570-6a65-42b8-a562-3e1ccdc8d562-kube-api-access-9vzjw\") pod \"machine-config-operator-74547568cd-c958f\" (UID: \"911d6570-6a65-42b8-a562-3e1ccdc8d562\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274444 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wj4v\" (UniqueName: \"kubernetes.io/projected/46f85b66-5a30-4bef-909c-26750b18e72d-kube-api-access-5wj4v\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdzwk\" (UID: \"46f85b66-5a30-4bef-909c-26750b18e72d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdzwk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274461 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrsnv\" (UniqueName: \"kubernetes.io/projected/c5b5487e-8a60-4967-b0f3-1d983c559f8a-kube-api-access-zrsnv\") pod \"packageserver-d55dfcdfc-krxdl\" (UID: \"c5b5487e-8a60-4967-b0f3-1d983c559f8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274504 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjb78\" (UniqueName: \"kubernetes.io/projected/cb2244ea-f203-4f66-9a4d-aad5e58a5c46-kube-api-access-gjb78\") pod \"catalog-operator-68c6474976-bz9nm\" (UID: \"cb2244ea-f203-4f66-9a4d-aad5e58a5c46\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274525 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32b31753-6a52-4364-b01f-9d50aeac7c13-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-78nss\" (UID: \"32b31753-6a52-4364-b01f-9d50aeac7c13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274547 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab926544-a708-445a-aaf9-0e3ad4593676-config\") pod \"service-ca-operator-777779d784-csktx\" (UID: \"ab926544-a708-445a-aaf9-0e3ad4593676\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csktx" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274590 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnzzr\" (UniqueName: \"kubernetes.io/projected/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-kube-api-access-lnzzr\") pod \"collect-profiles-29500620-rxhcg\" (UID: \"873a8c0c-9da4-4619-9ebf-7a327eb22b7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274615 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hvpn\" (UniqueName: \"kubernetes.io/projected/53727b9f-9f5f-4f6e-8fa2-a6018c8225f5-kube-api-access-6hvpn\") pod \"ingress-canary-kq22p\" (UID: \"53727b9f-9f5f-4f6e-8fa2-a6018c8225f5\") " pod="openshift-ingress-canary/ingress-canary-kq22p" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274664 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/debfaa13-5820-4570-a447-8ef48903144c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2wzzt\" (UID: \"debfaa13-5820-4570-a447-8ef48903144c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274691 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cb2244ea-f203-4f66-9a4d-aad5e58a5c46-profile-collector-cert\") pod \"catalog-operator-68c6474976-bz9nm\" (UID: \"cb2244ea-f203-4f66-9a4d-aad5e58a5c46\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274709 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32b31753-6a52-4364-b01f-9d50aeac7c13-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-78nss\" (UID: \"32b31753-6a52-4364-b01f-9d50aeac7c13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274750 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-secret-volume\") pod \"collect-profiles-29500620-rxhcg\" (UID: \"873a8c0c-9da4-4619-9ebf-7a327eb22b7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274767 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn92f\" (UniqueName: \"kubernetes.io/projected/2c9074bc-889d-4ce7-a250-6fc5984703e0-kube-api-access-xn92f\") pod \"marketplace-operator-79b997595-zcf44\" (UID: \"2c9074bc-889d-4ce7-a250-6fc5984703e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274782 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b06f9eba-0e3d-47fb-a386-a166987e78fd-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hlnsv\" (UID: \"b06f9eba-0e3d-47fb-a386-a166987e78fd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274798 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b31753-6a52-4364-b01f-9d50aeac7c13-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-78nss\" (UID: \"32b31753-6a52-4364-b01f-9d50aeac7c13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274838 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0a1d56e-00d0-4e88-bdfb-461578e327e6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-w9gmp\" (UID: \"a0a1d56e-00d0-4e88-bdfb-461578e327e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274854 4721 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a98572f-0fab-4dab-9935-6bf52cdc7fff-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wdm7\" (UID: \"0a98572f-0fab-4dab-9935-6bf52cdc7fff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274869 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zflp4\" (UniqueName: \"kubernetes.io/projected/625494cc-c7b0-4a0a-811c-d4822b1c0acc-kube-api-access-zflp4\") pod \"dns-default-kh9ph\" (UID: \"625494cc-c7b0-4a0a-811c-d4822b1c0acc\") " pod="openshift-dns/dns-default-kh9ph" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274907 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2qsg\" (UniqueName: \"kubernetes.io/projected/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-kube-api-access-p2qsg\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274926 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5b5487e-8a60-4967-b0f3-1d983c559f8a-webhook-cert\") pod \"packageserver-d55dfcdfc-krxdl\" (UID: \"c5b5487e-8a60-4967-b0f3-1d983c559f8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274948 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/911d6570-6a65-42b8-a562-3e1ccdc8d562-proxy-tls\") pod \"machine-config-operator-74547568cd-c958f\" (UID: \"911d6570-6a65-42b8-a562-3e1ccdc8d562\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.274989 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc-config\") pod \"kube-apiserver-operator-766d6c64bb-mkplf\" (UID: \"c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.275007 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b06f9eba-0e3d-47fb-a386-a166987e78fd-srv-cert\") pod \"olm-operator-6b444d44fb-hlnsv\" (UID: \"b06f9eba-0e3d-47fb-a386-a166987e78fd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.275024 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/37b1d658-0b12-4afd-9b42-9f54f553d432-signing-cabundle\") pod \"service-ca-9c57cc56f-r47km\" (UID: \"37b1d658-0b12-4afd-9b42-9f54f553d432\") " pod="openshift-service-ca/service-ca-9c57cc56f-r47km" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.275056 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mkplf\" (UID: \"c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.275084 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkxj7\" (UniqueName: \"kubernetes.io/projected/bace5e2a-2c1a-433c-bc00-9121a45aa515-kube-api-access-jkxj7\") pod \"migrator-59844c95c7-qr8r7\" (UID: \"bace5e2a-2c1a-433c-bc00-9121a45aa515\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qr8r7" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.275102 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/63b0f1ec-c2d9-4005-ba10-839949dbbcac-certs\") pod \"machine-config-server-8t7x8\" (UID: \"63b0f1ec-c2d9-4005-ba10-839949dbbcac\") " pod="openshift-machine-config-operator/machine-config-server-8t7x8" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.275116 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lbkx\" (UniqueName: \"kubernetes.io/projected/2de0eb97-f51f-4468-9a68-eb9d6a7ce40d-kube-api-access-4lbkx\") pod \"machine-config-controller-84d6567774-f5gbr\" (UID: \"2de0eb97-f51f-4468-9a68-eb9d6a7ce40d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.275152 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c5b5487e-8a60-4967-b0f3-1d983c559f8a-tmpfs\") pod \"packageserver-d55dfcdfc-krxdl\" (UID: \"c5b5487e-8a60-4967-b0f3-1d983c559f8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.275166 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5x7n\" (UniqueName: \"kubernetes.io/projected/37b1d658-0b12-4afd-9b42-9f54f553d432-kube-api-access-s5x7n\") pod \"service-ca-9c57cc56f-r47km\" (UID: \"37b1d658-0b12-4afd-9b42-9f54f553d432\") " pod="openshift-service-ca/service-ca-9c57cc56f-r47km" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.275184 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5b5487e-8a60-4967-b0f3-1d983c559f8a-apiservice-cert\") pod \"packageserver-d55dfcdfc-krxdl\" (UID: \"c5b5487e-8a60-4967-b0f3-1d983c559f8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.275225 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bksrl\" (UniqueName: \"kubernetes.io/projected/debfaa13-5820-4570-a447-8ef48903144c-kube-api-access-bksrl\") pod \"package-server-manager-789f6589d5-2wzzt\" (UID: \"debfaa13-5820-4570-a447-8ef48903144c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.275245 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/625494cc-c7b0-4a0a-811c-d4822b1c0acc-metrics-tls\") pod \"dns-default-kh9ph\" (UID: \"625494cc-c7b0-4a0a-811c-d4822b1c0acc\") " pod="openshift-dns/dns-default-kh9ph" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.275524 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c9074bc-889d-4ce7-a250-6fc5984703e0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zcf44\" (UID: \"2c9074bc-889d-4ce7-a250-6fc5984703e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.282710 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2de0eb97-f51f-4468-9a68-eb9d6a7ce40d-proxy-tls\") pod \"machine-config-controller-84d6567774-f5gbr\" (UID: \"2de0eb97-f51f-4468-9a68-eb9d6a7ce40d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.283484 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc-config\") pod \"kube-apiserver-operator-766d6c64bb-mkplf\" (UID: \"c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.284680 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c5b5487e-8a60-4967-b0f3-1d983c559f8a-tmpfs\") pod \"packageserver-d55dfcdfc-krxdl\" (UID: \"c5b5487e-8a60-4967-b0f3-1d983c559f8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.285299 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-socket-dir\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.285605 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/37b1d658-0b12-4afd-9b42-9f54f553d432-signing-cabundle\") pod \"service-ca-9c57cc56f-r47km\" (UID: \"37b1d658-0b12-4afd-9b42-9f54f553d432\") " pod="openshift-service-ca/service-ca-9c57cc56f-r47km" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.285694 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-mountpoint-dir\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: E0202 13:03:31.285897 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:31.785620305 +0000 UTC m=+152.088134694 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.287175 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cb2244ea-f203-4f66-9a4d-aad5e58a5c46-srv-cert\") pod \"catalog-operator-68c6474976-bz9nm\" (UID: \"cb2244ea-f203-4f66-9a4d-aad5e58a5c46\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.288439 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/911d6570-6a65-42b8-a562-3e1ccdc8d562-images\") pod \"machine-config-operator-74547568cd-c958f\" (UID: \"911d6570-6a65-42b8-a562-3e1ccdc8d562\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.296550 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/37b1d658-0b12-4afd-9b42-9f54f553d432-signing-key\") pod \"service-ca-9c57cc56f-r47km\" (UID: \"37b1d658-0b12-4afd-9b42-9f54f553d432\") " pod="openshift-service-ca/service-ca-9c57cc56f-r47km" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.296724 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/911d6570-6a65-42b8-a562-3e1ccdc8d562-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c958f\" (UID: \"911d6570-6a65-42b8-a562-3e1ccdc8d562\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.297850 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/63b0f1ec-c2d9-4005-ba10-839949dbbcac-certs\") pod \"machine-config-server-8t7x8\" (UID: \"63b0f1ec-c2d9-4005-ba10-839949dbbcac\") " pod="openshift-machine-config-operator/machine-config-server-8t7x8" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.300550 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2de0eb97-f51f-4468-9a68-eb9d6a7ce40d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-f5gbr\" (UID: \"2de0eb97-f51f-4468-9a68-eb9d6a7ce40d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.304814 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-config-volume\") pod \"collect-profiles-29500620-rxhcg\" (UID: \"873a8c0c-9da4-4619-9ebf-7a327eb22b7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.309048 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/c5b5487e-8a60-4967-b0f3-1d983c559f8a-apiservice-cert\") pod \"packageserver-d55dfcdfc-krxdl\" (UID: \"c5b5487e-8a60-4967-b0f3-1d983c559f8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.309467 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccp75\" (UniqueName: \"kubernetes.io/projected/ab4249b9-1751-45d6-be3f-58668c4542bd-kube-api-access-ccp75\") pod \"apiserver-7bbb656c7d-p9lvq\" (UID: \"ab4249b9-1751-45d6-be3f-58668c4542bd\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.309899 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a98572f-0fab-4dab-9935-6bf52cdc7fff-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wdm7\" (UID: \"0a98572f-0fab-4dab-9935-6bf52cdc7fff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.314515 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/625494cc-c7b0-4a0a-811c-d4822b1c0acc-config-volume\") pod \"dns-default-kh9ph\" (UID: \"625494cc-c7b0-4a0a-811c-d4822b1c0acc\") " pod="openshift-dns/dns-default-kh9ph" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.315247 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0a1d56e-00d0-4e88-bdfb-461578e327e6-config\") pod \"kube-controller-manager-operator-78b949d7b-w9gmp\" (UID: \"a0a1d56e-00d0-4e88-bdfb-461578e327e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.315417 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-plugins-dir\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.315492 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6104d27e-fefa-4e2a-9b9e-62013c96f664-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c68d5\" (UID: \"6104d27e-fefa-4e2a-9b9e-62013c96f664\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c68d5" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.315697 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-registration-dir\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.315900 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab926544-a708-445a-aaf9-0e3ad4593676-config\") pod \"service-ca-operator-777779d784-csktx\" (UID: \"ab926544-a708-445a-aaf9-0e3ad4593676\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csktx" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.317745 4721 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-secret-volume\") pod \"collect-profiles-29500620-rxhcg\" (UID: \"873a8c0c-9da4-4619-9ebf-7a327eb22b7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.318003 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cb2244ea-f203-4f66-9a4d-aad5e58a5c46-profile-collector-cert\") pod \"catalog-operator-68c6474976-bz9nm\" (UID: \"cb2244ea-f203-4f66-9a4d-aad5e58a5c46\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.318907 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32b31753-6a52-4364-b01f-9d50aeac7c13-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-78nss\" (UID: \"32b31753-6a52-4364-b01f-9d50aeac7c13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.320696 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32b31753-6a52-4364-b01f-9d50aeac7c13-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-78nss\" (UID: \"32b31753-6a52-4364-b01f-9d50aeac7c13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.324754 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53727b9f-9f5f-4f6e-8fa2-a6018c8225f5-cert\") pod \"ingress-canary-kq22p\" (UID: \"53727b9f-9f5f-4f6e-8fa2-a6018c8225f5\") " pod="openshift-ingress-canary/ingress-canary-kq22p" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.325861 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0a1d56e-00d0-4e88-bdfb-461578e327e6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-w9gmp\" (UID: \"a0a1d56e-00d0-4e88-bdfb-461578e327e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.327617 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/625494cc-c7b0-4a0a-811c-d4822b1c0acc-metrics-tls\") pod \"dns-default-kh9ph\" (UID: \"625494cc-c7b0-4a0a-811c-d4822b1c0acc\") " pod="openshift-dns/dns-default-kh9ph" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.327958 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/911d6570-6a65-42b8-a562-3e1ccdc8d562-proxy-tls\") pod \"machine-config-operator-74547568cd-c958f\" (UID: \"911d6570-6a65-42b8-a562-3e1ccdc8d562\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.328590 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/63b0f1ec-c2d9-4005-ba10-839949dbbcac-node-bootstrap-token\") pod \"machine-config-server-8t7x8\" (UID: 
\"63b0f1ec-c2d9-4005-ba10-839949dbbcac\") " pod="openshift-machine-config-operator/machine-config-server-8t7x8" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.330646 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a98572f-0fab-4dab-9935-6bf52cdc7fff-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wdm7\" (UID: \"0a98572f-0fab-4dab-9935-6bf52cdc7fff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.330917 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5b5487e-8a60-4967-b0f3-1d983c559f8a-webhook-cert\") pod \"packageserver-d55dfcdfc-krxdl\" (UID: \"c5b5487e-8a60-4967-b0f3-1d983c559f8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.331143 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjxt2\" (UniqueName: \"kubernetes.io/projected/b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb-kube-api-access-tjxt2\") pod \"router-default-5444994796-7vhgv\" (UID: \"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb\") " pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.331163 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab926544-a708-445a-aaf9-0e3ad4593676-serving-cert\") pod \"service-ca-operator-777779d784-csktx\" (UID: \"ab926544-a708-445a-aaf9-0e3ad4593676\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csktx" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.331689 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b06f9eba-0e3d-47fb-a386-a166987e78fd-srv-cert\") pod \"olm-operator-6b444d44fb-hlnsv\" (UID: \"b06f9eba-0e3d-47fb-a386-a166987e78fd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.332383 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mkplf\" (UID: \"c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.332832 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/debfaa13-5820-4570-a447-8ef48903144c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2wzzt\" (UID: \"debfaa13-5820-4570-a447-8ef48903144c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.337181 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/46f85b66-5a30-4bef-909c-26750b18e72d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdzwk\" (UID: \"46f85b66-5a30-4bef-909c-26750b18e72d\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdzwk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.337992 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkvfb\" (UniqueName: \"kubernetes.io/projected/0a98572f-0fab-4dab-9935-6bf52cdc7fff-kube-api-access-hkvfb\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wdm7\" (UID: \"0a98572f-0fab-4dab-9935-6bf52cdc7fff\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.346252 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b06f9eba-0e3d-47fb-a386-a166987e78fd-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hlnsv\" (UID: \"b06f9eba-0e3d-47fb-a386-a166987e78fd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.353045 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2c9074bc-889d-4ce7-a250-6fc5984703e0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zcf44\" (UID: \"2c9074bc-889d-4ce7-a250-6fc5984703e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.367604 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.368557 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hvpn\" (UniqueName: \"kubernetes.io/projected/53727b9f-9f5f-4f6e-8fa2-a6018c8225f5-kube-api-access-6hvpn\") pod \"ingress-canary-kq22p\" (UID: \"53727b9f-9f5f-4f6e-8fa2-a6018c8225f5\") " pod="openshift-ingress-canary/ingress-canary-kq22p" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.374317 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zt9ng"] Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.378416 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.378954 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:31 crc kubenswrapper[4721]: E0202 13:03:31.379049 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:31.879034422 +0000 UTC m=+152.181548811 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.391006 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.397778 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkxj7\" (UniqueName: \"kubernetes.io/projected/bace5e2a-2c1a-433c-bc00-9121a45aa515-kube-api-access-jkxj7\") pod \"migrator-59844c95c7-qr8r7\" (UID: \"bace5e2a-2c1a-433c-bc00-9121a45aa515\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qr8r7" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.398549 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsw9p\" (UniqueName: \"kubernetes.io/projected/b06f9eba-0e3d-47fb-a386-a166987e78fd-kube-api-access-hsw9p\") pod \"olm-operator-6b444d44fb-hlnsv\" (UID: \"b06f9eba-0e3d-47fb-a386-a166987e78fd\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.420815 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qr8r7" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.435906 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q"] Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.431553 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lbkx\" (UniqueName: \"kubernetes.io/projected/2de0eb97-f51f-4468-9a68-eb9d6a7ce40d-kube-api-access-4lbkx\") pod \"machine-config-controller-84d6567774-f5gbr\" (UID: \"2de0eb97-f51f-4468-9a68-eb9d6a7ce40d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.444963 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.448319 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w84l4\" (UniqueName: \"kubernetes.io/projected/ab926544-a708-445a-aaf9-0e3ad4593676-kube-api-access-w84l4\") pod \"service-ca-operator-777779d784-csktx\" (UID: \"ab926544-a708-445a-aaf9-0e3ad4593676\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csktx" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.451433 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l49ld"] Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.459041 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4sll\" (UniqueName: \"kubernetes.io/projected/63b0f1ec-c2d9-4005-ba10-839949dbbcac-kube-api-access-n4sll\") pod \"machine-config-server-8t7x8\" (UID: \"63b0f1ec-c2d9-4005-ba10-839949dbbcac\") " pod="openshift-machine-config-operator/machine-config-server-8t7x8" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.460704 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" event={"ID":"3c0670a6-888e-40e3-bf5d-82779e70dd1c","Type":"ContainerStarted","Data":"d435dac504fe3034e5527129f376c6ed65b5ea3e1fe83d1eb8463d6282795a18"} Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.460752 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" event={"ID":"3c0670a6-888e-40e3-bf5d-82779e70dd1c","Type":"ContainerStarted","Data":"167d1cdbeb24f93927ece3e3fa3df789c23a8308344e8f29012657e06e53e904"} Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.461268 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.464254 4721 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-ffkjd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.464301 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" podUID="3c0670a6-888e-40e3-bf5d-82779e70dd1c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.475338 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.475821 4721 generic.go:334] "Generic (PLEG): container finished" podID="64fc7a32-5852-4e03-b1b7-1663f7f52b65" containerID="a4cef40e8a1ba99ffc6b7c7bf185e7d2180fb0fbc1aa0082f5c2fcea210eadb0" exitCode=0 Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.475986 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kcw66" event={"ID":"64fc7a32-5852-4e03-b1b7-1663f7f52b65","Type":"ContainerDied","Data":"a4cef40e8a1ba99ffc6b7c7bf185e7d2180fb0fbc1aa0082f5c2fcea210eadb0"} Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.476038 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kcw66" event={"ID":"64fc7a32-5852-4e03-b1b7-1663f7f52b65","Type":"ContainerStarted","Data":"6433b629d1bde4bb980befd60ac6efea557893fc4cea82b8bd13eb528ae2b9a2"} Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.478649 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7grx\" (UniqueName: \"kubernetes.io/projected/6104d27e-fefa-4e2a-9b9e-62013c96f664-kube-api-access-m7grx\") pod \"multus-admission-controller-857f4d67dd-c68d5\" (UID: \"6104d27e-fefa-4e2a-9b9e-62013c96f664\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c68d5" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.480711 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: E0202 13:03:31.481374 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:31.98135161 +0000 UTC m=+152.283865999 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.490552 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" event={"ID":"8f1e834f-23b5-42a5-9d13-b9e5720a597c","Type":"ContainerStarted","Data":"e50c92de6389f2984e09f1db3217014b56b2e9a4e2a44d5d7b627a3954d39329"} Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.491192 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.492467 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" event={"ID":"67f000c5-5173-44e3-89e6-446c345a6c05","Type":"ContainerStarted","Data":"85b5ad265fd3a6bc88b82d6a0a25866b976d9a961360c1827fefaea8446a285c"} Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.503161 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kq22p" Feb 02 13:03:31 crc kubenswrapper[4721]: W0202 13:03:31.530613 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf02a63f_5e62_47ff_baf5_1dc1e95dc1ad.slice/crio-72a9519fa163c4d5e947e7bea8efe4073413ddbd4dcca1685658420850559431 WatchSource:0}: Error finding container 72a9519fa163c4d5e947e7bea8efe4073413ddbd4dcca1685658420850559431: Status 404 returned error can't find the container with id 72a9519fa163c4d5e947e7bea8efe4073413ddbd4dcca1685658420850559431 Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.538458 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8t7x8" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.543715 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5x7n\" (UniqueName: \"kubernetes.io/projected/37b1d658-0b12-4afd-9b42-9f54f553d432-kube-api-access-s5x7n\") pod \"service-ca-9c57cc56f-r47km\" (UID: \"37b1d658-0b12-4afd-9b42-9f54f553d432\") " pod="openshift-service-ca/service-ca-9c57cc56f-r47km" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.548890 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.553992 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bksrl\" (UniqueName: \"kubernetes.io/projected/debfaa13-5820-4570-a447-8ef48903144c-kube-api-access-bksrl\") pod \"package-server-manager-789f6589d5-2wzzt\" (UID: \"debfaa13-5820-4570-a447-8ef48903144c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.558155 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph" event={"ID":"083f0d8a-e0c4-46ae-8993-8547dd260553","Type":"ContainerStarted","Data":"533a6f959716e9909cf0141a19ed1a201052ec95a0fcc8a95909ea7a1c40cad5"} Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.558278 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph" event={"ID":"083f0d8a-e0c4-46ae-8993-8547dd260553","Type":"ContainerStarted","Data":"0ce15c1c8fdb67f5b17f39530bbd9d91767933a832ff9983c81c0606885a8b64"} Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.577519 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrsnv\" (UniqueName: \"kubernetes.io/projected/c5b5487e-8a60-4967-b0f3-1d983c559f8a-kube-api-access-zrsnv\") pod \"packageserver-d55dfcdfc-krxdl\" (UID: \"c5b5487e-8a60-4967-b0f3-1d983c559f8a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.586022 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:31 crc kubenswrapper[4721]: E0202 13:03:31.586211 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:32.08618684 +0000 UTC m=+152.388701229 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.586564 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.586673 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" Feb 02 13:03:31 crc kubenswrapper[4721]: E0202 13:03:31.587117 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:32.087098835 +0000 UTC m=+152.389613224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.593915 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wj4v\" (UniqueName: \"kubernetes.io/projected/46f85b66-5a30-4bef-909c-26750b18e72d-kube-api-access-5wj4v\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdzwk\" (UID: \"46f85b66-5a30-4bef-909c-26750b18e72d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdzwk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.594810 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7vhgv" event={"ID":"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb","Type":"ContainerStarted","Data":"27861f5afd594c7c97e253d8a6c6893c1f11a45ddd73eafa17299df4c668a81e"} Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.598275 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" event={"ID":"962524c6-7992-43d5-a7f3-5fdd04297f24","Type":"ContainerStarted","Data":"e1c6b11699215c240779ba4ffc084b0f044db3750d6c816f2d805a78f36b24e5"} Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.615184 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vzjw\" (UniqueName: \"kubernetes.io/projected/911d6570-6a65-42b8-a562-3e1ccdc8d562-kube-api-access-9vzjw\") pod \"machine-config-operator-74547568cd-c958f\" (UID: \"911d6570-6a65-42b8-a562-3e1ccdc8d562\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 
13:03:31.619834 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32b31753-6a52-4364-b01f-9d50aeac7c13-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-78nss\" (UID: \"32b31753-6a52-4364-b01f-9d50aeac7c13\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.637901 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zg529"] Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.647643 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0a1d56e-00d0-4e88-bdfb-461578e327e6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-w9gmp\" (UID: \"a0a1d56e-00d0-4e88-bdfb-461578e327e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.658405 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mkplf\" (UID: \"c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.661346 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn92f\" (UniqueName: \"kubernetes.io/projected/2c9074bc-889d-4ce7-a250-6fc5984703e0-kube-api-access-xn92f\") pod \"marketplace-operator-79b997595-zcf44\" (UID: \"2c9074bc-889d-4ce7-a250-6fc5984703e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.687323 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zflp4\" (UniqueName: \"kubernetes.io/projected/625494cc-c7b0-4a0a-811c-d4822b1c0acc-kube-api-access-zflp4\") pod \"dns-default-kh9ph\" (UID: \"625494cc-c7b0-4a0a-811c-d4822b1c0acc\") " pod="openshift-dns/dns-default-kh9ph" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.687771 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:31 crc kubenswrapper[4721]: E0202 13:03:31.689034 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:32.189004302 +0000 UTC m=+152.491518691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.695443 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.704426 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.704671 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2qsg\" (UniqueName: \"kubernetes.io/projected/6dc5b220-3e84-4a0a-9f7a-f27a007436f6-kube-api-access-p2qsg\") pod \"csi-hostpathplugin-7gtg4\" (UID: \"6dc5b220-3e84-4a0a-9f7a-f27a007436f6\") " pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.708875 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.716181 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.728468 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdzwk" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.732458 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjb78\" (UniqueName: \"kubernetes.io/projected/cb2244ea-f203-4f66-9a4d-aad5e58a5c46-kube-api-access-gjb78\") pod \"catalog-operator-68c6474976-bz9nm\" (UID: \"cb2244ea-f203-4f66-9a4d-aad5e58a5c46\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.736381 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-csktx" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.742630 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.747741 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.764846 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-r47km" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.765886 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-c68d5" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.769775 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnzzr\" (UniqueName: \"kubernetes.io/projected/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-kube-api-access-lnzzr\") pod \"collect-profiles-29500620-rxhcg\" (UID: \"873a8c0c-9da4-4619-9ebf-7a327eb22b7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.784408 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.789551 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.790059 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: E0202 13:03:31.790438 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:32.290422335 +0000 UTC m=+152.592936724 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.798346 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.831599 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.848347 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-kh9ph" Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.848998 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr"] Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.888888 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jrfhj"] Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.891059 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:31 crc kubenswrapper[4721]: E0202 13:03:31.891424 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:32.391410236 +0000 UTC m=+152.693924625 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.917122 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk"] Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.917585 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz"] Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.943925 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-2dsnx"] Feb 02 13:03:31 crc kubenswrapper[4721]: I0202 13:03:31.993134 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:31 crc kubenswrapper[4721]: E0202 13:03:31.993585 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:32.493562429 +0000 UTC m=+152.796077008 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:32 crc kubenswrapper[4721]: W0202 13:03:32.003529 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf028d7b1_e474_45f8_9c4e_d1b2322175c7.slice/crio-80061aa979518cc3deff6b9b39f3aef1ce5a33b6cbfaffcec4325f4ed5c8e70e WatchSource:0}: Error finding container 80061aa979518cc3deff6b9b39f3aef1ce5a33b6cbfaffcec4325f4ed5c8e70e: Status 404 returned error can't find the container with id 80061aa979518cc3deff6b9b39f3aef1ce5a33b6cbfaffcec4325f4ed5c8e70e
Feb 02 13:03:32 crc kubenswrapper[4721]: W0202 13:03:32.027191 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75fc13c2_ccc5_46a0_8a65_d6bc5340baab.slice/crio-e908c7ce1d810feb7859e0cf4d35f00393b9ea94e002a59b76c024bc0ace56e9 WatchSource:0}: Error finding container e908c7ce1d810feb7859e0cf4d35f00393b9ea94e002a59b76c024bc0ace56e9: Status 404 returned error can't find the container with id e908c7ce1d810feb7859e0cf4d35f00393b9ea94e002a59b76c024bc0ace56e9
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.061637 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg"
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.084492 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq"]
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.099973 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:03:32 crc kubenswrapper[4721]: E0202 13:03:32.100137 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:32.600107787 +0000 UTC m=+152.902622176 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.100371 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk"
Feb 02 13:03:32 crc kubenswrapper[4721]: E0202 13:03:32.100752 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:32.600736535 +0000 UTC m=+152.903250924 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.114931 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-f4l8v"]
Feb 02 13:03:32 crc kubenswrapper[4721]: W0202 13:03:32.135575 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae3f417e_2bae_44dd_973f_5314b6f64972.slice/crio-03d46b30c74ef5c8430b448c7ad678889af4272d7acbd48aceae6628fd4f71b5 WatchSource:0}: Error finding container 03d46b30c74ef5c8430b448c7ad678889af4272d7acbd48aceae6628fd4f71b5: Status 404 returned error can't find the container with id 03d46b30c74ef5c8430b448c7ad678889af4272d7acbd48aceae6628fd4f71b5
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.201818 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:03:32 crc kubenswrapper[4721]: E0202 13:03:32.202126 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:32.702112227 +0000 UTC m=+153.004626606 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.209294 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qr8r7"]
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.209696 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h"]
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.224713 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2jzch"]
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.228020 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7"]
Feb 02 13:03:32 crc kubenswrapper[4721]: W0202 13:03:32.255498 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbace5e2a_2c1a_433c_bc00_9121a45aa515.slice/crio-37015924ad96e03e0f661a4fba73db346b9adb1eb9a8b110fa18f4af486c1aca WatchSource:0}: Error finding container 37015924ad96e03e0f661a4fba73db346b9adb1eb9a8b110fa18f4af486c1aca: Status 404 returned error can't find the container with id 37015924ad96e03e0f661a4fba73db346b9adb1eb9a8b110fa18f4af486c1aca
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.300623 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv"]
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.303436 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk"
Feb 02 13:03:32 crc kubenswrapper[4721]: E0202 13:03:32.304057 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:32.804046554 +0000 UTC m=+153.106560943 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.404767 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:03:32 crc kubenswrapper[4721]: E0202 13:03:32.405266 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:32.905248211 +0000 UTC m=+153.207762600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.511475 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk"
Feb 02 13:03:32 crc kubenswrapper[4721]: E0202 13:03:32.512486 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:33.012470447 +0000 UTC m=+153.314984826 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.517331 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss"]
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.562296 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" podStartSLOduration=127.562277579 podStartE2EDuration="2m7.562277579s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:32.524456287 +0000 UTC m=+152.826970666" watchObservedRunningTime="2026-02-02 13:03:32.562277579 +0000 UTC m=+152.864791968"
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.579467 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kq22p"]
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.614054 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:03:32 crc kubenswrapper[4721]: E0202 13:03:32.614576 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:33.114555719 +0000 UTC m=+153.417070108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.634701 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" event={"ID":"4a636bbb-70b8-4b2a-96c6-94f9edba40cc","Type":"ContainerStarted","Data":"a33996efe12dac3d19d0fa4037ecc635f6ba9b302da27d202a6e47235462ef26"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.654836 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk" event={"ID":"75fc13c2-ccc5-46a0-8a65-d6bc5340baab","Type":"ContainerStarted","Data":"e908c7ce1d810feb7859e0cf4d35f00393b9ea94e002a59b76c024bc0ace56e9"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.668240 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" event={"ID":"67f000c5-5173-44e3-89e6-446c345a6c05","Type":"ContainerStarted","Data":"3a1667a5a46eba8d886a8fbbd8d7f7a9a2bc174ba18feb3a4ef74f6861bf3271"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.668297 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" event={"ID":"67f000c5-5173-44e3-89e6-446c345a6c05","Type":"ContainerStarted","Data":"92082378009f2e00a893951b41ea2c82fc949ed54229ccbc4da67e0536e3538c"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.674044 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2dsnx" event={"ID":"ae3f417e-2bae-44dd-973f-5314b6f64972","Type":"ContainerStarted","Data":"03d46b30c74ef5c8430b448c7ad678889af4272d7acbd48aceae6628fd4f71b5"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.700164 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8t7x8" event={"ID":"63b0f1ec-c2d9-4005-ba10-839949dbbcac","Type":"ContainerStarted","Data":"516ab5e09d3a567d7aa34953df58fe969f47679e7e5877e0ef18bd895e330728"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.700230 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8t7x8" event={"ID":"63b0f1ec-c2d9-4005-ba10-839949dbbcac","Type":"ContainerStarted","Data":"9f99838276207803d8e457f0e4737ba0b54dcc0e9d633a51cbe24ccd90f651a3"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.715874 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk"
Feb 02 13:03:32 crc kubenswrapper[4721]: E0202 13:03:32.716428 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:33.216405825 +0000 UTC m=+153.518920214 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.727088 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zt9ng" event={"ID":"af02a63f-5e62-47ff-baf5-1dc1e95dc1ad","Type":"ContainerStarted","Data":"bdc7f242349ef501ae60453f7027fd3b9ab0c5fdb0624d98e48543734baa6123"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.727132 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zt9ng" event={"ID":"af02a63f-5e62-47ff-baf5-1dc1e95dc1ad","Type":"ContainerStarted","Data":"72a9519fa163c4d5e947e7bea8efe4073413ddbd4dcca1685658420850559431"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.728002 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-zt9ng"
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.748040 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" event={"ID":"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d","Type":"ContainerStarted","Data":"204dbbb074af32312956d8c5808e584109e5348e60b7f0ba197a03fb8d7567c5"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.750202 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" event={"ID":"962524c6-7992-43d5-a7f3-5fdd04297f24","Type":"ContainerStarted","Data":"81f1ed113ff28d261e45b8f089ff55331cfa48c5e28c0300ad3ede4e1aa70b95"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.753181 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq"
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.760491 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-f4l8v" event={"ID":"61358eab-20de-46bb-9701-dc736e6eb5ff","Type":"ContainerStarted","Data":"533d18c71783fb7d55dc1e3ccbdf776ca8bae4e98dcbd5fd2e6684297c6d4fb5"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.764351 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" event={"ID":"70d74e61-4d44-4a6c-8a14-16e131d79e47","Type":"ContainerStarted","Data":"9077b0d5727400ff737670e938f50acb1eeaf69ec2d4125726b027b461a0bd7d"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.778157 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf"]
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.781449 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" event={"ID":"202d08f0-f5ea-4414-b2e6-5a690148a823","Type":"ContainerStarted","Data":"5677dea813e91cff0125caea41dbe47841266e23be8dbe14bda6d75d556e4c6c"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.781475 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" event={"ID":"202d08f0-f5ea-4414-b2e6-5a690148a823","Type":"ContainerStarted","Data":"1d70ffe0b3734085db9dfe928cc8392846bfd74b2233917dacca3de2113ee019"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.799528 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7vhgv" event={"ID":"b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb","Type":"ContainerStarted","Data":"85c792eb18ace81b853014dd66f8dc7071938962eedf924be3e1db2064da8739"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.802699 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zg529" event={"ID":"a1dde568-291e-40bf-9df7-18cd5449d0aa","Type":"ContainerStarted","Data":"8d1176f054af953add2a2532e07fd308e898dabf738f3dad9dfa3ffcd452a99d"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.803795 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l49ld" event={"ID":"3bbe9190-62bb-4079-afa7-adc9e970eae6","Type":"ContainerStarted","Data":"59fdf7e0fd87927cd832a352d934961ab5ba72d93251892b8dde5809f3cb586d"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.805633 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qr8r7" event={"ID":"bace5e2a-2c1a-433c-bc00-9121a45aa515","Type":"ContainerStarted","Data":"37015924ad96e03e0f661a4fba73db346b9adb1eb9a8b110fa18f4af486c1aca"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.807599 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr" event={"ID":"f028d7b1-e474-45f8-9c4e-d1b2322175c7","Type":"ContainerStarted","Data":"80061aa979518cc3deff6b9b39f3aef1ce5a33b6cbfaffcec4325f4ed5c8e70e"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.812836 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zcf44"]
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.816798 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:03:32 crc kubenswrapper[4721]: E0202 13:03:32.821595 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:33.321573243 +0000 UTC m=+153.624087632 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.823555 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7" event={"ID":"0a98572f-0fab-4dab-9935-6bf52cdc7fff","Type":"ContainerStarted","Data":"c27507bd9c98546b987f4412d4fcbe150bb83423cdf6072cd7347f04919d8804"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.825756 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" event={"ID":"ab4249b9-1751-45d6-be3f-58668c4542bd","Type":"ContainerStarted","Data":"7f732d595674c6b44c99146a616adea5454f90e5957ad804ba076c7a4cc56ed6"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.828690 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" event={"ID":"f2a74f12-1bed-4744-9dec-57282d5301eb","Type":"ContainerStarted","Data":"665a012355c7f53b5297723bfd7f4326c471d3e631d117239a2f21c841495568"}
Feb 02 13:03:32 crc kubenswrapper[4721]: I0202 13:03:32.919756 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk"
Feb 02 13:03:32 crc kubenswrapper[4721]: E0202 13:03:32.928617 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:33.424026655 +0000 UTC m=+153.726541044 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.018686 4721 patch_prober.go:28] interesting pod/downloads-7954f5f757-zt9ng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.018735 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zt9ng" podUID="af02a63f-5e62-47ff-baf5-1dc1e95dc1ad" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.019037 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" podStartSLOduration=127.019026897 podStartE2EDuration="2m7.019026897s" podCreationTimestamp="2026-02-02 13:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:32.956492436 +0000 UTC m=+153.259006845" watchObservedRunningTime="2026-02-02 13:03:33.019026897 +0000 UTC m=+153.321541286"
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.021372 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd"
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.021921 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:03:33 crc kubenswrapper[4721]: E0202 13:03:33.022459 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:33.522436584 +0000 UTC m=+153.824950973 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.123686 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk"
Feb 02 13:03:33 crc kubenswrapper[4721]: E0202 13:03:33.124296 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:33.624282298 +0000 UTC m=+153.926796687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.229737 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:03:33 crc kubenswrapper[4721]: E0202 13:03:33.230122 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:33.730091866 +0000 UTC m=+154.032606255 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.332156 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk"
Feb 02 13:03:33 crc kubenswrapper[4721]: E0202 13:03:33.332896 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:33.832875827 +0000 UTC m=+154.135390226 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.395747 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-7vhgv"
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.433377 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:03:33 crc kubenswrapper[4721]: E0202 13:03:33.433876 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:33.933849257 +0000 UTC m=+154.236363646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.537541 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk"
Feb 02 13:03:33 crc kubenswrapper[4721]: E0202 13:03:33.537904 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:34.037889895 +0000 UTC m=+154.340404284 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.537912 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-pnfph" podStartSLOduration=128.537899395 podStartE2EDuration="2m8.537899395s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:33.537015559 +0000 UTC m=+153.839529948" watchObservedRunningTime="2026-02-02 13:03:33.537899395 +0000 UTC m=+153.840413794"
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.638793 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:03:33 crc kubenswrapper[4721]: E0202 13:03:33.639443 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:34.13940958 +0000 UTC m=+154.441923969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.639911 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk"
Feb 02 13:03:33 crc kubenswrapper[4721]: E0202 13:03:33.640339 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:34.140309006 +0000 UTC m=+154.442823395 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.663660 4721 patch_prober.go:28] interesting pod/router-default-5444994796-7vhgv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 02 13:03:33 crc kubenswrapper[4721]: [-]has-synced failed: reason withheld
Feb 02 13:03:33 crc kubenswrapper[4721]: [+]process-running ok
Feb 02 13:03:33 crc kubenswrapper[4721]: healthz check failed
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.663721 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7vhgv" podUID="b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.740937 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:03:33 crc kubenswrapper[4721]: E0202 13:03:33.741529 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:34.241513422 +0000 UTC m=+154.544027811 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.766939 4721 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-fqbhq container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.766987 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.826413 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-zt9ng" podStartSLOduration=128.826387947 podStartE2EDuration="2m8.826387947s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:33.766716886 +0000 UTC m=+154.069231275" watchObservedRunningTime="2026-02-02 13:03:33.826387947 +0000 UTC m=+154.128902336"
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.842979 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk"
Feb 02 13:03:33 crc kubenswrapper[4721]: E0202 13:03:33.843403 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:34.343391108 +0000 UTC m=+154.645905497 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.886093 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8t7x8" podStartSLOduration=5.886056386 podStartE2EDuration="5.886056386s" podCreationTimestamp="2026-02-02 13:03:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:33.833392675 +0000 UTC m=+154.135907084" watchObservedRunningTime="2026-02-02 13:03:33.886056386 +0000 UTC m=+154.188570785"
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.887509 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-7vhgv" podStartSLOduration=128.887504237 podStartE2EDuration="2m8.887504237s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:33.886368856 +0000 UTC m=+154.188883245" watchObservedRunningTime="2026-02-02 13:03:33.887504237 +0000 UTC m=+154.190018626"
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.891102 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf" event={"ID":"c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc","Type":"ContainerStarted","Data":"8de304604797617bd37941c34b8dadab3dda32a570ac8404e01cefa5aa3f8bd2"}
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.924494 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pvz9q" podStartSLOduration=128.924460835 podStartE2EDuration="2m8.924460835s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:33.914055159 +0000 UTC m=+154.216569548" watchObservedRunningTime="2026-02-02 13:03:33.924460835 +0000 UTC m=+154.226975234"
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.933487 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt"]
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.946647 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:03:33 crc kubenswrapper[4721]: E0202 13:03:33.946925 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:34.446910861 +0000 UTC m=+154.749425250 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.949190 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qr8r7" event={"ID":"bace5e2a-2c1a-433c-bc00-9121a45aa515","Type":"ContainerStarted","Data":"c313e656890a11fee3ecb133b8c052827b22699f41ea38612bd581629105bf0e"}
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.964921 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv" event={"ID":"b06f9eba-0e3d-47fb-a386-a166987e78fd","Type":"ContainerStarted","Data":"9a3a10624151d9673d31391fff8105e05226e17aa8e1b700d2dd15cc01de39b9"}
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.979394 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" podStartSLOduration=128.97937452 podStartE2EDuration="2m8.97937452s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:33.975740427 +0000 UTC m=+154.278254816" watchObservedRunningTime="2026-02-02 13:03:33.97937452 +0000 UTC m=+154.281888909"
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.982431 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kq22p" event={"ID":"53727b9f-9f5f-4f6e-8fa2-a6018c8225f5","Type":"ContainerStarted","Data":"daf20f88f77c8ece317b4aa2425aba24b61b99d07deb21c59cb90fffbfa4c955"}
Feb 02 13:03:33 crc kubenswrapper[4721]: I0202 13:03:33.991393 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" event={"ID":"2c9074bc-889d-4ce7-a250-6fc5984703e0","Type":"ContainerStarted","Data":"420f57653af414c784badc1fa24ed323a8bb52594720023f0b3bf03137ac12b0"}
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.021056 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss" event={"ID":"32b31753-6a52-4364-b01f-9d50aeac7c13","Type":"ContainerStarted","Data":"16089a7aa7a271649dfcf92598ea5774a7dc62f01512306dffc4a2d67486cf7c"}
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.026996 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-f4l8v" event={"ID":"61358eab-20de-46bb-9701-dc736e6eb5ff","Type":"ContainerStarted","Data":"cdc58e2761d43372f1d268672fb769c108a9bc911c5903744f58b81765d9741f"}
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.028355 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-f4l8v"
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.034403 4721 patch_prober.go:28] interesting pod/console-operator-58897d9998-f4l8v container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body=
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.034436 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-f4l8v" podUID="61358eab-20de-46bb-9701-dc736e6eb5ff" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused"
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.036799 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l49ld" event={"ID":"3bbe9190-62bb-4079-afa7-adc9e970eae6","Type":"ContainerStarted","Data":"222b5aac1078254682082aa0f7d10df6cb374707e022072e262ca5b6a68337a8"}
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.047513 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk"
Feb 02 13:03:34 crc kubenswrapper[4721]: E0202 13:03:34.047888 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:34.54787529 +0000 UTC m=+154.850389689 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.060765 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kcw66" event={"ID":"64fc7a32-5852-4e03-b1b7-1663f7f52b65","Type":"ContainerStarted","Data":"cabf915466d27de266f83931f8e2f058da9b66dbd2cfd11eccef4fd3d8d537d9"}
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.072279 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-f4l8v" podStartSLOduration=129.072255551 podStartE2EDuration="2m9.072255551s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:34.070191613 +0000 UTC m=+154.372706012" watchObservedRunningTime="2026-02-02 13:03:34.072255551 +0000 UTC m=+154.374769950"
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.074317 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xbv2j" podStartSLOduration=129.074306299 podStartE2EDuration="2m9.074306299s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:33.998264235 +0000 UTC m=+154.300778624" watchObservedRunningTime="2026-02-02 13:03:34.074306299 +0000 UTC m=+154.376820688"
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.075518 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk" event={"ID":"75fc13c2-ccc5-46a0-8a65-d6bc5340baab","Type":"ContainerStarted","Data":"4205ae6283ef57c9351203b27201b46a698d790ed7fdc44a4f679e9a157827fb"}
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.106522 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zg529" event={"ID":"a1dde568-291e-40bf-9df7-18cd5449d0aa","Type":"ContainerStarted","Data":"24cc7e320cb8aa5dd535e97478c79d830906512c948d824576751370851008a0"}
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.123691 4721 generic.go:334] "Generic (PLEG): container finished" podID="ab4249b9-1751-45d6-be3f-58668c4542bd" containerID="9736769bdc705fc91b1d3ed306f50b43c662f0f5c195d71dedfa10fcfe69e90f" exitCode=0
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.123758 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" event={"ID":"ab4249b9-1751-45d6-be3f-58668c4542bd","Type":"ContainerDied","Data":"9736769bdc705fc91b1d3ed306f50b43c662f0f5c195d71dedfa10fcfe69e90f"}
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.135921 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2dsnx" event={"ID":"ae3f417e-2bae-44dd-973f-5314b6f64972","Type":"ContainerStarted","Data":"f6b3ae8c770ea590746df91083a751d3ccb8e36d4619a57121eebb509e13924a"}
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.137707 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" event={"ID":"70d74e61-4d44-4a6c-8a14-16e131d79e47","Type":"ContainerStarted","Data":"dbd4c9fba01b5efdc30190a223c20ae18c09fc0a5fc704cc9937daff3e4fce36"}
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.138672 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" event={"ID":"4a636bbb-70b8-4b2a-96c6-94f9edba40cc","Type":"ContainerStarted","Data":"0cf5199ed100e3e98a5cc08713ff630ccd68a9a593e15ef496c1e86b0c0a6e54"}
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.147980 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2rjrk" podStartSLOduration=129.147966605 podStartE2EDuration="2m9.147966605s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:34.121056323 +0000 UTC m=+154.423570722" watchObservedRunningTime="2026-02-02 13:03:34.147966605 +0000 UTC m=+154.450480994"
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.149811 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:03:34 crc kubenswrapper[4721]: E0202 13:03:34.152360 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:34.65233386 +0000 UTC m=+154.954848249 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.164308 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr" event={"ID":"f028d7b1-e474-45f8-9c4e-d1b2322175c7","Type":"ContainerStarted","Data":"cc6bc4adfcd7a74fe555803cac666054d590cd120b7c6deceeebb28df5ac858c"}
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.165258 4721 patch_prober.go:28] interesting pod/downloads-7954f5f757-zt9ng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.165296 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zt9ng" podUID="af02a63f-5e62-47ff-baf5-1dc1e95dc1ad" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.177755 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-csktx"]
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.180556 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq"
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.190445 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jrfhj" podStartSLOduration=129.190429679 podStartE2EDuration="2m9.190429679s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:34.189684017 +0000 UTC m=+154.492198406" watchObservedRunningTime="2026-02-02 13:03:34.190429679 +0000 UTC m=+154.492944068"
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.250927 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-2dsnx" podStartSLOduration=129.250911502 podStartE2EDuration="2m9.250911502s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:34.233617792 +0000 UTC m=+154.536132181" watchObservedRunningTime="2026-02-02 13:03:34.250911502 +0000 UTC m=+154.553425891"
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.252475 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c958f"]
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.255409 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk"
Feb 02 13:03:34 crc kubenswrapper[4721]: E0202 13:03:34.275372 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:34.775352564 +0000 UTC m=+155.077866953 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.296885 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7gtg4"]
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.302272 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp"]
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.307097 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kh9ph"]
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.351458 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-td8sr" podStartSLOduration=129.351430779 podStartE2EDuration="2m9.351430779s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:34.315756048 +0000 UTC m=+154.618270447" watchObservedRunningTime="2026-02-02 13:03:34.351430779 +0000 UTC m=+154.653945168"
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.357172 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:03:34 crc kubenswrapper[4721]: E0202 13:03:34.357348 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:34.857320606 +0000 UTC m=+155.159834995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.357551 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk"
Feb 02 13:03:34 crc kubenswrapper[4721]: E0202 13:03:34.357886 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:34.857879581 +0000 UTC m=+155.160393970 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.364835 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr"]
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.399262 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-r47km"]
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.401527 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm"]
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.403129 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl"]
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.404427 4721 patch_prober.go:28] interesting pod/router-default-5444994796-7vhgv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 02 13:03:34 crc kubenswrapper[4721]: [-]has-synced failed: reason withheld
Feb 02 13:03:34 crc kubenswrapper[4721]: [+]process-running ok
Feb 02 13:03:34 crc kubenswrapper[4721]: healthz check failed
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.404463 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7vhgv" podUID="b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.450727 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c68d5"]
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.461948 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdzwk"]
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.462736 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 13:03:34 crc kubenswrapper[4721]: E0202 13:03:34.463139 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:34.963124663 +0000 UTC m=+155.265639042 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 13:03:34 crc kubenswrapper[4721]: W0202 13:03:34.484871 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6104d27e_fefa_4e2a_9b9e_62013c96f664.slice/crio-c5040c8797ef39726b4af64827b6a2417ffe50aa5ddb4db88a844c65aeb34b5b WatchSource:0}: Error finding container c5040c8797ef39726b4af64827b6a2417ffe50aa5ddb4db88a844c65aeb34b5b: Status 404 returned error can't find the container with id c5040c8797ef39726b4af64827b6a2417ffe50aa5ddb4db88a844c65aeb34b5b
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.487755 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg"]
Feb 02 13:03:34 crc kubenswrapper[4721]: W0202 13:03:34.528471 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46f85b66_5a30_4bef_909c_26750b18e72d.slice/crio-a8c8c985d386766f9c9bda3485f99c69a2946aa1ec8a32de2d8987042f9826c8 WatchSource:0}: Error finding container a8c8c985d386766f9c9bda3485f99c69a2946aa1ec8a32de2d8987042f9826c8: Status 404 returned error can't find the container with id a8c8c985d386766f9c9bda3485f99c69a2946aa1ec8a32de2d8987042f9826c8
Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.564637 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk"
Feb 02 13:03:34 crc kubenswrapper[4721]: E0202 13:03:34.565077 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:35.06504672 +0000 UTC m=+155.367561109 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.665998 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:34 crc kubenswrapper[4721]: E0202 13:03:34.666670 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:35.166651978 +0000 UTC m=+155.469166377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.767690 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:34 crc kubenswrapper[4721]: E0202 13:03:34.772434 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:35.272395083 +0000 UTC m=+155.574909492 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.870357 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:34 crc kubenswrapper[4721]: E0202 13:03:34.870737 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:35.370722669 +0000 UTC m=+155.673237048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.979726 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:34 crc kubenswrapper[4721]: E0202 13:03:34.980087 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:35.480056376 +0000 UTC m=+155.782570765 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.980524 4721 csr.go:261] certificate signing request csr-46nkk is approved, waiting to be issued Feb 02 13:03:34 crc kubenswrapper[4721]: I0202 13:03:34.995844 4721 csr.go:257] certificate signing request csr-46nkk is issued Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.080378 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:35 crc kubenswrapper[4721]: E0202 13:03:35.080653 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:35.580639755 +0000 UTC m=+155.883154144 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.176712 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" event={"ID":"873a8c0c-9da4-4619-9ebf-7a327eb22b7e","Type":"ContainerStarted","Data":"e5df09a9819f06ce199926abbd018fc8f1a3ae0cbae66765ccb3d7e1c3a1a81c"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.176757 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" event={"ID":"873a8c0c-9da4-4619-9ebf-7a327eb22b7e","Type":"ContainerStarted","Data":"a3ba63e33fb814de0a45f3a1bb2d277752ed10e110e4a55af2fb1ec65495a8cc"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.185554 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:35 crc kubenswrapper[4721]: E0202 13:03:35.186041 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:35.68602415 +0000 UTC m=+155.988538539 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.189909 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp" event={"ID":"a0a1d56e-00d0-4e88-bdfb-461578e327e6","Type":"ContainerStarted","Data":"c3b326b65fcfbf8fe7658c5e452941da2a6ae05b45d0bb4cab349594633e9799"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.195369 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" event={"ID":"6dc5b220-3e84-4a0a-9f7a-f27a007436f6","Type":"ContainerStarted","Data":"89c981d0d28206453ee7b4dd2abc982c40975e7768fcebc94fb10e119811b66b"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.208565 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" podStartSLOduration=130.208545528 podStartE2EDuration="2m10.208545528s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.207441057 +0000 UTC m=+155.509955446" watchObservedRunningTime="2026-02-02 13:03:35.208545528 +0000 UTC m=+155.511059917" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.209669 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdzwk" event={"ID":"46f85b66-5a30-4bef-909c-26750b18e72d","Type":"ContainerStarted","Data":"c99ca5207922d482d12e385981ae8866eee591181a2eadd21b9431af1880d536"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.209701 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdzwk" event={"ID":"46f85b66-5a30-4bef-909c-26750b18e72d","Type":"ContainerStarted","Data":"a8c8c985d386766f9c9bda3485f99c69a2946aa1ec8a32de2d8987042f9826c8"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.222544 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-csktx" event={"ID":"ab926544-a708-445a-aaf9-0e3ad4593676","Type":"ContainerStarted","Data":"3a484d5baeedc1c3a99091c629d18674bdc023dba7fc279b1b427b38d4511479"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.222616 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-csktx" event={"ID":"ab926544-a708-445a-aaf9-0e3ad4593676","Type":"ContainerStarted","Data":"eee1366e170e03c54c022bcac6ae246546ca4799522f5652a6e19c5f099c5abf"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.230685 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" event={"ID":"ab4249b9-1751-45d6-be3f-58668c4542bd","Type":"ContainerStarted","Data":"fca0e3a431ad3dc1fd30cd813ef83109ac7dbc0572362f94da1481a13f4a7825"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.231452 4721 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdzwk" podStartSLOduration=130.231435896 podStartE2EDuration="2m10.231435896s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.229031429 +0000 UTC m=+155.531545828" watchObservedRunningTime="2026-02-02 13:03:35.231435896 +0000 UTC m=+155.533950285" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.246641 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" event={"ID":"2c9074bc-889d-4ce7-a250-6fc5984703e0","Type":"ContainerStarted","Data":"6e95f003df211d09b9562e86431541c3b7c3e84c41d01ea470d07b5cb914180b"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.246678 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.248675 4721 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zcf44 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.248768 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" podUID="2c9074bc-889d-4ce7-a250-6fc5984703e0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.255658 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-r47km" event={"ID":"37b1d658-0b12-4afd-9b42-9f54f553d432","Type":"ContainerStarted","Data":"15d3f8ae4d864c8501535fef147df2c2bf89fadd84ee90b3b2d6e8a544d6d5c5"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.255707 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-r47km" event={"ID":"37b1d658-0b12-4afd-9b42-9f54f553d432","Type":"ContainerStarted","Data":"3c3b769c8caaa9dd65a847a67f93dcdcb50715c187002d82bf642870ae1ad65f"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.290941 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-kcw66" event={"ID":"64fc7a32-5852-4e03-b1b7-1663f7f52b65","Type":"ContainerStarted","Data":"744cb2b7251373e328f634242bd5951588b52ac275861886686451996b57f157"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.296242 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:35 crc kubenswrapper[4721]: E0202 13:03:35.303970 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-02 13:03:35.80392345 +0000 UTC m=+156.106437839 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.304428 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:35 crc kubenswrapper[4721]: E0202 13:03:35.304782 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:35.804760934 +0000 UTC m=+156.107275323 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.305852 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-csktx" podStartSLOduration=129.305832404 podStartE2EDuration="2m9.305832404s" podCreationTimestamp="2026-02-02 13:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.269143854 +0000 UTC m=+155.571658243" watchObservedRunningTime="2026-02-02 13:03:35.305832404 +0000 UTC m=+155.608346793" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.307432 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" podStartSLOduration=130.307424259 podStartE2EDuration="2m10.307424259s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.304803655 +0000 UTC m=+155.607318054" watchObservedRunningTime="2026-02-02 13:03:35.307424259 +0000 UTC m=+155.609938648" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.321176 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kq22p" event={"ID":"53727b9f-9f5f-4f6e-8fa2-a6018c8225f5","Type":"ContainerStarted","Data":"5161b213f903145ea88c0c0561cb8ec55279a296067271b5a1e33b49be714025"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.338963 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-r47km" 
podStartSLOduration=129.338946052 podStartE2EDuration="2m9.338946052s" podCreationTimestamp="2026-02-02 13:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.337795569 +0000 UTC m=+155.640309958" watchObservedRunningTime="2026-02-02 13:03:35.338946052 +0000 UTC m=+155.641460441" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.360236 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss" event={"ID":"32b31753-6a52-4364-b01f-9d50aeac7c13","Type":"ContainerStarted","Data":"4ff2c99b0f1215e20b5ea256b46b69f44328458fd8c75861e1839a1eeeb1f151"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.373303 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf" event={"ID":"c8c9b912-fd75-4f62-bb98-9a37bd5e7ebc","Type":"ContainerStarted","Data":"bcb3b94189b8f6b7f370b3ebefaa653af4936f6edb10b47441866b338bb5d34c"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.404388 4721 patch_prober.go:28] interesting pod/router-default-5444994796-7vhgv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 13:03:35 crc kubenswrapper[4721]: [-]has-synced failed: reason withheld Feb 02 13:03:35 crc kubenswrapper[4721]: [+]process-running ok Feb 02 13:03:35 crc kubenswrapper[4721]: healthz check failed Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.404462 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7vhgv" podUID="b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.407833 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:35 crc kubenswrapper[4721]: E0202 13:03:35.408356 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:35.908282166 +0000 UTC m=+156.210796565 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.408659 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.409164 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" podStartSLOduration=130.409117789 podStartE2EDuration="2m10.409117789s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.379632114 +0000 UTC m=+155.682146503" watchObservedRunningTime="2026-02-02 13:03:35.409117789 +0000 UTC m=+155.711632188" Feb 02 13:03:35 crc kubenswrapper[4721]: E0202 13:03:35.410631 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:35.910609792 +0000 UTC m=+156.213124181 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.417292 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c68d5" event={"ID":"6104d27e-fefa-4e2a-9b9e-62013c96f664","Type":"ContainerStarted","Data":"c5040c8797ef39726b4af64827b6a2417ffe50aa5ddb4db88a844c65aeb34b5b"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.424596 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-78nss" podStartSLOduration=130.424579717 podStartE2EDuration="2m10.424579717s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.421991214 +0000 UTC m=+155.724505603" watchObservedRunningTime="2026-02-02 13:03:35.424579717 +0000 UTC m=+155.727094116" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.432756 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l49ld" event={"ID":"3bbe9190-62bb-4079-afa7-adc9e970eae6","Type":"ContainerStarted","Data":"86fd6e993a06d4d91a16ddb7c3a024205e6c4608b70cb619dc6637169574ffab"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.466667 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt" event={"ID":"debfaa13-5820-4570-a447-8ef48903144c","Type":"ContainerStarted","Data":"da3c6102abe819addd1522029e3af039e678a12c624cde39aed41a2c321e3ade"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.466708 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt" event={"ID":"debfaa13-5820-4570-a447-8ef48903144c","Type":"ContainerStarted","Data":"11762e7d97ad277d747cb94e2e100ba563f76d9f9f428ccfbc766da978836302"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.466718 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt" event={"ID":"debfaa13-5820-4570-a447-8ef48903144c","Type":"ContainerStarted","Data":"5c48c7658ea296353132e081ad07eaec773355461c09ff0e4f19c84bf52e6705"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.467287 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.474088 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-kcw66" podStartSLOduration=130.47405723 podStartE2EDuration="2m10.47405723s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.45538239 +0000 UTC m=+155.757896789" 
watchObservedRunningTime="2026-02-02 13:03:35.47405723 +0000 UTC m=+155.776571619" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.475223 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kq22p" podStartSLOduration=7.475217262 podStartE2EDuration="7.475217262s" podCreationTimestamp="2026-02-02 13:03:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.472962758 +0000 UTC m=+155.775477147" watchObservedRunningTime="2026-02-02 13:03:35.475217262 +0000 UTC m=+155.777731651" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.512361 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" event={"ID":"4a107c9e-3f0e-4d4b-8b02-65bda6793a2d","Type":"ContainerStarted","Data":"a6deb52a71363fd4f383c9ee9d1bd67067af795681ba66fe2ff0fb4273539d08"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.512831 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:35 crc kubenswrapper[4721]: E0202 13:03:35.513057 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:36.013040903 +0000 UTC m=+156.315555292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.513239 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:35 crc kubenswrapper[4721]: E0202 13:03:35.513492 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:36.013486236 +0000 UTC m=+156.316000625 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.553483 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7" event={"ID":"0a98572f-0fab-4dab-9935-6bf52cdc7fff","Type":"ContainerStarted","Data":"ea214fd18bf76c462f5154b629db887d121191dc24151aed967735447e721aa2"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.559401 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qr8r7" event={"ID":"bace5e2a-2c1a-433c-bc00-9121a45aa515","Type":"ContainerStarted","Data":"f2498a2579c452883ea2adc933b84935f1c81166a594286ae8b2ff491993bc1d"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.599009 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mkplf" podStartSLOduration=130.598988888 podStartE2EDuration="2m10.598988888s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.512304073 +0000 UTC m=+155.814818462" watchObservedRunningTime="2026-02-02 13:03:35.598988888 +0000 UTC m=+155.901503277" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.616540 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:35 crc kubenswrapper[4721]: E0202 13:03:35.617539 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:36.117521713 +0000 UTC m=+156.420036112 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.635442 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l49ld" podStartSLOduration=130.63542336 podStartE2EDuration="2m10.63542336s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.600395068 +0000 UTC m=+155.902909457" watchObservedRunningTime="2026-02-02 13:03:35.63542336 +0000 UTC m=+155.937937739" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.670763 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zg529" event={"ID":"a1dde568-291e-40bf-9df7-18cd5449d0aa","Type":"ContainerStarted","Data":"4dfdfa3de3bd05c9ac15a39f27fdf71821bae33b58be473cf1629df4faa3b249"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.671078 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2jzch" podStartSLOduration=130.671048369 podStartE2EDuration="2m10.671048369s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.670503614 +0000 UTC m=+155.973017993" watchObservedRunningTime="2026-02-02 13:03:35.671048369 +0000 UTC m=+155.973562748" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.672116 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt" podStartSLOduration=129.672111799 podStartE2EDuration="2m9.672111799s" podCreationTimestamp="2026-02-02 13:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.640829104 +0000 UTC m=+155.943343493" watchObservedRunningTime="2026-02-02 13:03:35.672111799 +0000 UTC m=+155.974626188" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.714562 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv" event={"ID":"b06f9eba-0e3d-47fb-a386-a166987e78fd","Type":"ContainerStarted","Data":"df15f53ea93e89750e5f950d8d33fff9a9ea7f43c52a31c89a745f8551b59cde"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.715413 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.720465 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:35 crc kubenswrapper[4721]: E0202 13:03:35.721382 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:36.221365894 +0000 UTC m=+156.523880283 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.734601 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.737059 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qr8r7" podStartSLOduration=130.737044878 podStartE2EDuration="2m10.737044878s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.709270362 +0000 UTC m=+156.011784751" watchObservedRunningTime="2026-02-02 13:03:35.737044878 +0000 UTC m=+156.039559287" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.741818 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" event={"ID":"c5b5487e-8a60-4967-b0f3-1d983c559f8a","Type":"ContainerStarted","Data":"05942b3e4764cef408eca65ef809cb3200501f49bf3319b23e4a403483cf33e1"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.741909 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" event={"ID":"c5b5487e-8a60-4967-b0f3-1d983c559f8a","Type":"ContainerStarted","Data":"b36b044de996d054bd6154612f956bcb38cfaee73c1f49bafe2226cf6641c478"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.742839 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.744760 4721 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-krxdl container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" start-of-body= Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.744797 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" podUID="c5b5487e-8a60-4967-b0f3-1d983c559f8a" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.774640 4721 generic.go:334] "Generic (PLEG): container finished" podID="f2a74f12-1bed-4744-9dec-57282d5301eb" 
containerID="5d044c245a6bf58bd6b2e12977072e9547408c5abc9b4ec2e3376d67a3734b1c" exitCode=0 Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.774763 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" event={"ID":"f2a74f12-1bed-4744-9dec-57282d5301eb","Type":"ContainerDied","Data":"5d044c245a6bf58bd6b2e12977072e9547408c5abc9b4ec2e3376d67a3734b1c"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.778026 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" event={"ID":"cb2244ea-f203-4f66-9a4d-aad5e58a5c46","Type":"ContainerStarted","Data":"94760d753ba4419ca2c51d97fbed3f37ab09e4a9d8196aa4a68c6c65f2ebeeda"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.778091 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" event={"ID":"cb2244ea-f203-4f66-9a4d-aad5e58a5c46","Type":"ContainerStarted","Data":"c7ff8128edb569b53e1fd8002b5f74481b94875605c8b92ab77686d729b43f08"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.778762 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.780929 4721 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bz9nm container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.781567 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" podUID="cb2244ea-f203-4f66-9a4d-aad5e58a5c46" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.812314 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hlnsv" podStartSLOduration=129.81229714 podStartE2EDuration="2m9.81229714s" podCreationTimestamp="2026-02-02 13:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.810558191 +0000 UTC m=+156.113072580" watchObservedRunningTime="2026-02-02 13:03:35.81229714 +0000 UTC m=+156.114811519" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.812582 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wdm7" podStartSLOduration=130.812578278 podStartE2EDuration="2m10.812578278s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.73780842 +0000 UTC m=+156.040322809" watchObservedRunningTime="2026-02-02 13:03:35.812578278 +0000 UTC m=+156.115092667" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.823982 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.824178 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" event={"ID":"911d6570-6a65-42b8-a562-3e1ccdc8d562","Type":"ContainerStarted","Data":"f0dcb34fe6bc0451437480fb90656ae130cd5c8a2cc6405231d2e07c997c5770"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.824227 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" event={"ID":"911d6570-6a65-42b8-a562-3e1ccdc8d562","Type":"ContainerStarted","Data":"11bf8925ab36a7f8feacf8fe485e2f9f35eafcb7a10b06f41bc5045d87fc04e2"} Feb 02 13:03:35 crc kubenswrapper[4721]: E0202 13:03:35.825364 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:36.325344959 +0000 UTC m=+156.627859408 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.848244 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-zg529" podStartSLOduration=130.848227728 podStartE2EDuration="2m10.848227728s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.847405135 +0000 UTC m=+156.149919524" watchObservedRunningTime="2026-02-02 13:03:35.848227728 +0000 UTC m=+156.150742117" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.876505 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kh9ph" event={"ID":"625494cc-c7b0-4a0a-811c-d4822b1c0acc","Type":"ContainerStarted","Data":"173ea934a18ec19b0c46c01ff6062ba029490ac9f3437ab721e44703df7dde5a"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.898890 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" podStartSLOduration=129.898859492 podStartE2EDuration="2m9.898859492s" podCreationTimestamp="2026-02-02 13:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.897982027 +0000 UTC m=+156.200496416" watchObservedRunningTime="2026-02-02 13:03:35.898859492 +0000 UTC m=+156.201373881" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.909588 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr" 
event={"ID":"2de0eb97-f51f-4468-9a68-eb9d6a7ce40d","Type":"ContainerStarted","Data":"ab988754b924cad3030308810cdaff3015e74064772da97cdd5f52fe3ae386f0"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.909642 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr" event={"ID":"2de0eb97-f51f-4468-9a68-eb9d6a7ce40d","Type":"ContainerStarted","Data":"56cc9472fd9c8c633f44e42ebde3e87948267970cd566223980a9506e2168ebd"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.925440 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:35 crc kubenswrapper[4721]: E0202 13:03:35.926577 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:36.426562187 +0000 UTC m=+156.729076576 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.930294 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" event={"ID":"70d74e61-4d44-4a6c-8a14-16e131d79e47","Type":"ContainerStarted","Data":"6ded78ee58b89f7f490ce23e98391149b41fe9dd20d3ea356745be901f162a02"} Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.936058 4721 patch_prober.go:28] interesting pod/downloads-7954f5f757-zt9ng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.936158 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zt9ng" podUID="af02a63f-5e62-47ff-baf5-1dc1e95dc1ad" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 02 13:03:35 crc kubenswrapper[4721]: I0202 13:03:35.950248 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" podStartSLOduration=129.950233508 podStartE2EDuration="2m9.950233508s" podCreationTimestamp="2026-02-02 13:01:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:35.948551589 +0000 UTC m=+156.251065978" watchObservedRunningTime="2026-02-02 13:03:35.950233508 +0000 UTC m=+156.252747897" Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.000855 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate 
expiration is 2027-02-02 12:58:34 +0000 UTC, rotation deadline is 2026-12-02 17:01:42.055177192 +0000 UTC Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.000926 4721 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7275h58m6.054254958s for next certificate rotation Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.018269 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" podStartSLOduration=131.018249194 podStartE2EDuration="2m11.018249194s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:36.017433501 +0000 UTC m=+156.319947890" watchObservedRunningTime="2026-02-02 13:03:36.018249194 +0000 UTC m=+156.320763583" Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.030474 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.032975 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:36.532946871 +0000 UTC m=+156.835461260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.053333 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr" podStartSLOduration=131.053317247 podStartE2EDuration="2m11.053317247s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:36.052516285 +0000 UTC m=+156.355030684" watchObservedRunningTime="2026-02-02 13:03:36.053317247 +0000 UTC m=+156.355831636" Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.134434 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.134791 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-02 13:03:36.634778225 +0000 UTC m=+156.937292604 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.235867 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.236340 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:36.736295991 +0000 UTC m=+157.038810380 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.338003 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.338401 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:36.838386613 +0000 UTC m=+157.140901002 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.379346 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.379544 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.395566 4721 patch_prober.go:28] interesting pod/router-default-5444994796-7vhgv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 13:03:36 crc kubenswrapper[4721]: [-]has-synced failed: reason withheld Feb 02 13:03:36 crc kubenswrapper[4721]: [+]process-running ok Feb 02 13:03:36 crc kubenswrapper[4721]: healthz check failed Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.396017 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7vhgv" podUID="b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.438819 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.438982 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:36.938958202 +0000 UTC m=+157.241472591 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.439080 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.439425 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:36.939417174 +0000 UTC m=+157.241931563 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.540397 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.540609 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:37.040560319 +0000 UTC m=+157.343074708 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.540899 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.541274 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:37.04125812 +0000 UTC m=+157.343772509 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.642148 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.642366 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:37.142330672 +0000 UTC m=+157.444845051 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.642478 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.642831 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:37.142816156 +0000 UTC m=+157.445330545 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.731490 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-f4l8v" Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.744012 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.744219 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:37.244188438 +0000 UTC m=+157.546702827 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.744336 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.744633 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:37.24462123 +0000 UTC m=+157.547135619 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.781977 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z25fz" podStartSLOduration=131.781959738 podStartE2EDuration="2m11.781959738s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:36.102346226 +0000 UTC m=+156.404860615" watchObservedRunningTime="2026-02-02 13:03:36.781959738 +0000 UTC m=+157.084474127" Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.845706 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.845870 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:37.345843067 +0000 UTC m=+157.648357456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.846054 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.846366 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:37.346352822 +0000 UTC m=+157.648867211 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.940173 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp" event={"ID":"a0a1d56e-00d0-4e88-bdfb-461578e327e6","Type":"ContainerStarted","Data":"4aaf11806c83c9a8c17f1f6647fcc6636c08cfc5d59a93259cf0b1eb1e64c3ee"} Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.944768 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c958f" event={"ID":"911d6570-6a65-42b8-a562-3e1ccdc8d562","Type":"ContainerStarted","Data":"6c70bed4abe3f6f707760ed133feebd0cc860c4122dfc398c48b54a4187ceeb1"} Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.946213 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" event={"ID":"6dc5b220-3e84-4a0a-9f7a-f27a007436f6","Type":"ContainerStarted","Data":"9da17c9d0725f02292c51b99e9e26c02584a85efbda4a2be69a38b3eae85296c"} Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.946538 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.946665 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:37.446643832 +0000 UTC m=+157.749158221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.946774 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:36 crc kubenswrapper[4721]: E0202 13:03:36.947110 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:37.447094826 +0000 UTC m=+157.749609215 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.954628 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kh9ph" event={"ID":"625494cc-c7b0-4a0a-811c-d4822b1c0acc","Type":"ContainerStarted","Data":"a7fc4d052f1e7098b1ae4f2ee53b969db52a7df9be37565a5fe65ba3df3a7323"} Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.954675 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kh9ph" event={"ID":"625494cc-c7b0-4a0a-811c-d4822b1c0acc","Type":"ContainerStarted","Data":"4aa2069f7abf95d557c6f6e51977472c9e73803c17d0fc953e81a07accab08ed"} Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.954788 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-kh9ph" Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.962876 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-f5gbr" event={"ID":"2de0eb97-f51f-4468-9a68-eb9d6a7ce40d","Type":"ContainerStarted","Data":"f4cab96224bddd5a8a888b08f3b2b58f17ef1fcad2c8e71a3c9833a79c5f7742"} Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.964644 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c68d5" event={"ID":"6104d27e-fefa-4e2a-9b9e-62013c96f664","Type":"ContainerStarted","Data":"db700c1a357f0e0b7fddce98c406153a2f44ea1407caeae97c74bd1efdb90a89"} Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.964697 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c68d5" event={"ID":"6104d27e-fefa-4e2a-9b9e-62013c96f664","Type":"ContainerStarted","Data":"b82a346e555b506db84a00ba966ded4574a5f93dde381f4739157aa3264abc19"} Feb 02 13:03:36 crc 
kubenswrapper[4721]: I0202 13:03:36.966853 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" event={"ID":"f2a74f12-1bed-4744-9dec-57282d5301eb","Type":"ContainerStarted","Data":"b29303ec65ab709b7092cfc60d80e7ffbc12ce0f422d5144218d19a98595025f"} Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.968579 4721 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zcf44 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.968639 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" podUID="2c9074bc-889d-4ce7-a250-6fc5984703e0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.971408 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz9nm" Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.993845 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-w9gmp" podStartSLOduration=131.993826219 podStartE2EDuration="2m11.993826219s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:36.972242688 +0000 UTC m=+157.274757077" watchObservedRunningTime="2026-02-02 13:03:36.993826219 +0000 UTC m=+157.296340608" Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.994269 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kh9ph" podStartSLOduration=8.994264531 podStartE2EDuration="8.994264531s" podCreationTimestamp="2026-02-02 13:03:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:36.991685778 +0000 UTC m=+157.294200167" watchObservedRunningTime="2026-02-02 13:03:36.994264531 +0000 UTC m=+157.296778910" Feb 02 13:03:36 crc kubenswrapper[4721]: I0202 13:03:36.997326 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.048054 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:37 crc kubenswrapper[4721]: E0202 13:03:37.049698 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:37.549681801 +0000 UTC m=+157.852196190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.103557 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-c68d5" podStartSLOduration=132.103537007 podStartE2EDuration="2m12.103537007s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:37.091936928 +0000 UTC m=+157.394451317" watchObservedRunningTime="2026-02-02 13:03:37.103537007 +0000 UTC m=+157.406051396" Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.150851 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:37 crc kubenswrapper[4721]: E0202 13:03:37.151329 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:37.65131446 +0000 UTC m=+157.953828849 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.247713 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" podStartSLOduration=132.24769605 podStartE2EDuration="2m12.24769605s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:37.197503108 +0000 UTC m=+157.500017497" watchObservedRunningTime="2026-02-02 13:03:37.24769605 +0000 UTC m=+157.550210439" Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.253497 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:37 crc kubenswrapper[4721]: E0202 13:03:37.253751 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:37.753738252 +0000 UTC m=+158.056252641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.354428 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:37 crc kubenswrapper[4721]: E0202 13:03:37.355130 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:37.855107673 +0000 UTC m=+158.157622062 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.398869 4721 patch_prober.go:28] interesting pod/router-default-5444994796-7vhgv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 13:03:37 crc kubenswrapper[4721]: [-]has-synced failed: reason withheld Feb 02 13:03:37 crc kubenswrapper[4721]: [+]process-running ok Feb 02 13:03:37 crc kubenswrapper[4721]: healthz check failed Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.399181 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7vhgv" podUID="b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.445911 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.455333 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:37 crc kubenswrapper[4721]: E0202 13:03:37.455616 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:37.95559961 +0000 UTC m=+158.258113999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.504211 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-krxdl" Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.556874 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:37 crc kubenswrapper[4721]: E0202 13:03:37.557226 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.057214248 +0000 UTC m=+158.359728637 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.658876 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:37 crc kubenswrapper[4721]: E0202 13:03:37.659058 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.159028542 +0000 UTC m=+158.461542931 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.659188 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:37 crc kubenswrapper[4721]: E0202 13:03:37.659744 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.159728142 +0000 UTC m=+158.462242531 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.756965 4721 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.760943 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:37 crc kubenswrapper[4721]: E0202 13:03:37.761194 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.261152855 +0000 UTC m=+158.563667244 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.761373 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:37 crc kubenswrapper[4721]: E0202 13:03:37.761811 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.261800474 +0000 UTC m=+158.564314853 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.862238 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:37 crc kubenswrapper[4721]: E0202 13:03:37.862404 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.362373072 +0000 UTC m=+158.664887471 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.862555 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:37 crc kubenswrapper[4721]: E0202 13:03:37.862842 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.362828975 +0000 UTC m=+158.665343364 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.964089 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:37 crc kubenswrapper[4721]: E0202 13:03:37.964337 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.464300899 +0000 UTC m=+158.766815288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.964522 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:37 crc kubenswrapper[4721]: E0202 13:03:37.964902 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.464892247 +0000 UTC m=+158.767406636 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.974488 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" event={"ID":"6dc5b220-3e84-4a0a-9f7a-f27a007436f6","Type":"ContainerStarted","Data":"3fa8c76009aefe1c90c623918380b724071582b4b9d3b5b3d8840a87e79ce2a8"} Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.974557 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" event={"ID":"6dc5b220-3e84-4a0a-9f7a-f27a007436f6","Type":"ContainerStarted","Data":"3fd192519c55df1c53965318280073b4ab1c355b4f16fb01f21868f30bc94dd3"} Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.974567 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" event={"ID":"6dc5b220-3e84-4a0a-9f7a-f27a007436f6","Type":"ContainerStarted","Data":"e26e926277d71aad91d7fb0a0961d04e8e299605fb44b3a37f74cbbca6b1a737"} Feb 02 13:03:37 crc kubenswrapper[4721]: I0202 13:03:37.982025 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-p9lvq" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.007971 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-7gtg4" podStartSLOduration=10.007951926 podStartE2EDuration="10.007951926s" podCreationTimestamp="2026-02-02 13:03:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:38.000212277 +0000 UTC m=+158.302726676" watchObservedRunningTime="2026-02-02 13:03:38.007951926 +0000 UTC m=+158.310466315" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.065411 4721 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:38 crc kubenswrapper[4721]: E0202 13:03:38.066197 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.566179955 +0000 UTC m=+158.868694344 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.158166 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ftf6s"] Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.159123 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.163173 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.166930 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:38 crc kubenswrapper[4721]: E0202 13:03:38.167371 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.667357261 +0000 UTC m=+158.969871650 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.169336 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ftf6s"] Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.268049 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:38 crc kubenswrapper[4721]: E0202 13:03:38.268318 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.76828679 +0000 UTC m=+159.070801189 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.268834 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srrj7\" (UniqueName: \"kubernetes.io/projected/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-kube-api-access-srrj7\") pod \"community-operators-ftf6s\" (UID: \"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b\") " pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.269014 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-catalog-content\") pod \"community-operators-ftf6s\" (UID: \"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b\") " pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.269194 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-utilities\") pod \"community-operators-ftf6s\" (UID: \"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b\") " pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.269336 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:38 crc kubenswrapper[4721]: E0202 13:03:38.269725 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.76971029 +0000 UTC m=+159.072224679 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.355240 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s2tcj"] Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.356768 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.358780 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.370519 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.370731 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2tcj"] Feb 02 13:03:38 crc kubenswrapper[4721]: E0202 13:03:38.370846 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.870754562 +0000 UTC m=+159.173268941 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.370967 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srrj7\" (UniqueName: \"kubernetes.io/projected/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-kube-api-access-srrj7\") pod \"community-operators-ftf6s\" (UID: \"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b\") " pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.371138 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-catalog-content\") pod \"community-operators-ftf6s\" (UID: \"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b\") " pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.371275 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-utilities\") pod \"community-operators-ftf6s\" (UID: \"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b\") " pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.371473 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:38 crc kubenswrapper[4721]: E0202 13:03:38.371792 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.871775632 +0000 UTC m=+159.174290021 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.371810 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-catalog-content\") pod \"community-operators-ftf6s\" (UID: \"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b\") " pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.371887 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-utilities\") pod \"community-operators-ftf6s\" (UID: \"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b\") " pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.393013 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srrj7\" (UniqueName: \"kubernetes.io/projected/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-kube-api-access-srrj7\") pod \"community-operators-ftf6s\" (UID: \"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b\") " pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.396432 4721 patch_prober.go:28] interesting pod/router-default-5444994796-7vhgv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 13:03:38 crc kubenswrapper[4721]: [-]has-synced failed: reason withheld Feb 02 13:03:38 crc kubenswrapper[4721]: [+]process-running ok Feb 02 13:03:38 crc kubenswrapper[4721]: healthz check failed Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.396504 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7vhgv" podUID="b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.472293 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:38 crc kubenswrapper[4721]: E0202 13:03:38.472398 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.972373201 +0000 UTC m=+159.274887590 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.472541 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c32d34a1-8dd8-435d-9491-748392c25b97-utilities\") pod \"certified-operators-s2tcj\" (UID: \"c32d34a1-8dd8-435d-9491-748392c25b97\") " pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.472590 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgmqw\" (UniqueName: \"kubernetes.io/projected/c32d34a1-8dd8-435d-9491-748392c25b97-kube-api-access-bgmqw\") pod \"certified-operators-s2tcj\" (UID: \"c32d34a1-8dd8-435d-9491-748392c25b97\") " pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.472638 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c32d34a1-8dd8-435d-9491-748392c25b97-catalog-content\") pod \"certified-operators-s2tcj\" (UID: \"c32d34a1-8dd8-435d-9491-748392c25b97\") " pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.472691 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:38 crc kubenswrapper[4721]: E0202 13:03:38.473004 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 13:03:38.972993298 +0000 UTC m=+159.275507687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-wlhhk" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.474007 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.555199 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-95btx"] Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.556462 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95btx" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.568865 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95btx"] Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.573449 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.573740 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c32d34a1-8dd8-435d-9491-748392c25b97-utilities\") pod \"certified-operators-s2tcj\" (UID: \"c32d34a1-8dd8-435d-9491-748392c25b97\") " pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.573793 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgmqw\" (UniqueName: \"kubernetes.io/projected/c32d34a1-8dd8-435d-9491-748392c25b97-kube-api-access-bgmqw\") pod \"certified-operators-s2tcj\" (UID: \"c32d34a1-8dd8-435d-9491-748392c25b97\") " pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.573838 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c32d34a1-8dd8-435d-9491-748392c25b97-catalog-content\") pod \"certified-operators-s2tcj\" (UID: \"c32d34a1-8dd8-435d-9491-748392c25b97\") " pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.574345 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c32d34a1-8dd8-435d-9491-748392c25b97-catalog-content\") pod \"certified-operators-s2tcj\" (UID: \"c32d34a1-8dd8-435d-9491-748392c25b97\") " pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:03:38 crc kubenswrapper[4721]: E0202 13:03:38.574426 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 13:03:39.074408852 +0000 UTC m=+159.376923241 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.574673 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c32d34a1-8dd8-435d-9491-748392c25b97-utilities\") pod \"certified-operators-s2tcj\" (UID: \"c32d34a1-8dd8-435d-9491-748392c25b97\") " pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.607772 4721 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-02T13:03:37.757232754Z","Handler":null,"Name":""} Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.613693 4721 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.613747 4721 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.627731 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgmqw\" (UniqueName: \"kubernetes.io/projected/c32d34a1-8dd8-435d-9491-748392c25b97-kube-api-access-bgmqw\") pod \"certified-operators-s2tcj\" (UID: \"c32d34a1-8dd8-435d-9491-748392c25b97\") " pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.672762 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.677800 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8cfp\" (UniqueName: \"kubernetes.io/projected/3275da10-006d-43c9-bdd6-46282b8ac9d1-kube-api-access-q8cfp\") pod \"community-operators-95btx\" (UID: \"3275da10-006d-43c9-bdd6-46282b8ac9d1\") " pod="openshift-marketplace/community-operators-95btx" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.677866 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3275da10-006d-43c9-bdd6-46282b8ac9d1-catalog-content\") pod \"community-operators-95btx\" (UID: \"3275da10-006d-43c9-bdd6-46282b8ac9d1\") " pod="openshift-marketplace/community-operators-95btx" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.677882 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3275da10-006d-43c9-bdd6-46282b8ac9d1-utilities\") pod \"community-operators-95btx\" (UID: \"3275da10-006d-43c9-bdd6-46282b8ac9d1\") " pod="openshift-marketplace/community-operators-95btx" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.677924 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.683542 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.683578 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.748913 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ftf6s"] Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.752155 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-wlhhk\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.774046 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hpqtk"] Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.777669 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.778726 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.778987 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpqtk"] Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.779009 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8cfp\" (UniqueName: \"kubernetes.io/projected/3275da10-006d-43c9-bdd6-46282b8ac9d1-kube-api-access-q8cfp\") pod \"community-operators-95btx\" (UID: \"3275da10-006d-43c9-bdd6-46282b8ac9d1\") " pod="openshift-marketplace/community-operators-95btx" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.779206 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3275da10-006d-43c9-bdd6-46282b8ac9d1-catalog-content\") pod \"community-operators-95btx\" (UID: \"3275da10-006d-43c9-bdd6-46282b8ac9d1\") " pod="openshift-marketplace/community-operators-95btx" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.779242 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3275da10-006d-43c9-bdd6-46282b8ac9d1-utilities\") pod \"community-operators-95btx\" (UID: \"3275da10-006d-43c9-bdd6-46282b8ac9d1\") " pod="openshift-marketplace/community-operators-95btx" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.779845 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3275da10-006d-43c9-bdd6-46282b8ac9d1-utilities\") pod \"community-operators-95btx\" (UID: \"3275da10-006d-43c9-bdd6-46282b8ac9d1\") " pod="openshift-marketplace/community-operators-95btx" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.780984 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3275da10-006d-43c9-bdd6-46282b8ac9d1-catalog-content\") pod \"community-operators-95btx\" (UID: \"3275da10-006d-43c9-bdd6-46282b8ac9d1\") " pod="openshift-marketplace/community-operators-95btx" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.805823 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8cfp\" (UniqueName: \"kubernetes.io/projected/3275da10-006d-43c9-bdd6-46282b8ac9d1-kube-api-access-q8cfp\") pod \"community-operators-95btx\" (UID: \"3275da10-006d-43c9-bdd6-46282b8ac9d1\") " pod="openshift-marketplace/community-operators-95btx" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.811860 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.818421 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.880046 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b5702e4-96dd-479b-871a-d69bfdba91e1-utilities\") pod \"certified-operators-hpqtk\" (UID: \"5b5702e4-96dd-479b-871a-d69bfdba91e1\") " pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.880292 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b5702e4-96dd-479b-871a-d69bfdba91e1-catalog-content\") pod \"certified-operators-hpqtk\" (UID: \"5b5702e4-96dd-479b-871a-d69bfdba91e1\") " pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.880333 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdlcg\" (UniqueName: \"kubernetes.io/projected/5b5702e4-96dd-479b-871a-d69bfdba91e1-kube-api-access-jdlcg\") pod \"certified-operators-hpqtk\" (UID: \"5b5702e4-96dd-479b-871a-d69bfdba91e1\") " pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.945251 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s2tcj"] Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.950997 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95btx" Feb 02 13:03:38 crc kubenswrapper[4721]: W0202 13:03:38.959002 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc32d34a1_8dd8_435d_9491_748392c25b97.slice/crio-607b84235315a7323834248f190f48d1f32164d2db07ef1dcc79f0ce6457a6d0 WatchSource:0}: Error finding container 607b84235315a7323834248f190f48d1f32164d2db07ef1dcc79f0ce6457a6d0: Status 404 returned error can't find the container with id 607b84235315a7323834248f190f48d1f32164d2db07ef1dcc79f0ce6457a6d0 Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.981914 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b5702e4-96dd-479b-871a-d69bfdba91e1-utilities\") pod \"certified-operators-hpqtk\" (UID: \"5b5702e4-96dd-479b-871a-d69bfdba91e1\") " pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.981955 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b5702e4-96dd-479b-871a-d69bfdba91e1-catalog-content\") pod \"certified-operators-hpqtk\" (UID: \"5b5702e4-96dd-479b-871a-d69bfdba91e1\") " pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.981978 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdlcg\" (UniqueName: \"kubernetes.io/projected/5b5702e4-96dd-479b-871a-d69bfdba91e1-kube-api-access-jdlcg\") pod \"certified-operators-hpqtk\" (UID: \"5b5702e4-96dd-479b-871a-d69bfdba91e1\") " pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.982405 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b5702e4-96dd-479b-871a-d69bfdba91e1-catalog-content\") pod \"certified-operators-hpqtk\" (UID: \"5b5702e4-96dd-479b-871a-d69bfdba91e1\") " pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.982871 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b5702e4-96dd-479b-871a-d69bfdba91e1-utilities\") pod \"certified-operators-hpqtk\" (UID: \"5b5702e4-96dd-479b-871a-d69bfdba91e1\") " pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.989476 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftf6s" event={"ID":"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b","Type":"ContainerStarted","Data":"c4262d6d2388653a81a9cb645a74803128568514778cf935a45ab36a4268cbc6"} Feb 02 13:03:38 crc kubenswrapper[4721]: I0202 13:03:38.989516 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftf6s" event={"ID":"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b","Type":"ContainerStarted","Data":"ff138d00e2dec0f6fe53dd62f78ed24adffd461fe550704795a81bdea55a7066"} Feb 02 13:03:39 crc kubenswrapper[4721]: I0202 13:03:39.011615 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdlcg\" (UniqueName: \"kubernetes.io/projected/5b5702e4-96dd-479b-871a-d69bfdba91e1-kube-api-access-jdlcg\") pod \"certified-operators-hpqtk\" (UID: 
\"5b5702e4-96dd-479b-871a-d69bfdba91e1\") " pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:03:39 crc kubenswrapper[4721]: I0202 13:03:39.016183 4721 generic.go:334] "Generic (PLEG): container finished" podID="873a8c0c-9da4-4619-9ebf-7a327eb22b7e" containerID="e5df09a9819f06ce199926abbd018fc8f1a3ae0cbae66765ccb3d7e1c3a1a81c" exitCode=0 Feb 02 13:03:39 crc kubenswrapper[4721]: I0202 13:03:39.016272 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" event={"ID":"873a8c0c-9da4-4619-9ebf-7a327eb22b7e","Type":"ContainerDied","Data":"e5df09a9819f06ce199926abbd018fc8f1a3ae0cbae66765ccb3d7e1c3a1a81c"} Feb 02 13:03:39 crc kubenswrapper[4721]: I0202 13:03:39.018827 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2tcj" event={"ID":"c32d34a1-8dd8-435d-9491-748392c25b97","Type":"ContainerStarted","Data":"607b84235315a7323834248f190f48d1f32164d2db07ef1dcc79f0ce6457a6d0"} Feb 02 13:03:39 crc kubenswrapper[4721]: I0202 13:03:39.030744 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fgg7h" Feb 02 13:03:39 crc kubenswrapper[4721]: I0202 13:03:39.098692 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wlhhk"] Feb 02 13:03:39 crc kubenswrapper[4721]: I0202 13:03:39.112309 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:03:39 crc kubenswrapper[4721]: I0202 13:03:39.188348 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95btx"] Feb 02 13:03:39 crc kubenswrapper[4721]: W0202 13:03:39.224344 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3275da10_006d_43c9_bdd6_46282b8ac9d1.slice/crio-d1d7313e7e9488250223e5d45503d08f43af174101c199e67ce71331531c040b WatchSource:0}: Error finding container d1d7313e7e9488250223e5d45503d08f43af174101c199e67ce71331531c040b: Status 404 returned error can't find the container with id d1d7313e7e9488250223e5d45503d08f43af174101c199e67ce71331531c040b Feb 02 13:03:39 crc kubenswrapper[4721]: I0202 13:03:39.310785 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hpqtk"] Feb 02 13:03:39 crc kubenswrapper[4721]: W0202 13:03:39.317428 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b5702e4_96dd_479b_871a_d69bfdba91e1.slice/crio-cee4e973cd74c01297b1880257bab371fd0bb8e104dad9318438c355e90ddc2e WatchSource:0}: Error finding container cee4e973cd74c01297b1880257bab371fd0bb8e104dad9318438c355e90ddc2e: Status 404 returned error can't find the container with id cee4e973cd74c01297b1880257bab371fd0bb8e104dad9318438c355e90ddc2e Feb 02 13:03:39 crc kubenswrapper[4721]: I0202 13:03:39.395201 4721 patch_prober.go:28] interesting pod/router-default-5444994796-7vhgv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 13:03:39 crc kubenswrapper[4721]: [-]has-synced failed: reason withheld Feb 02 13:03:39 crc kubenswrapper[4721]: [+]process-running ok Feb 02 13:03:39 crc kubenswrapper[4721]: healthz check failed Feb 02 
13:03:39 crc kubenswrapper[4721]: I0202 13:03:39.395258 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7vhgv" podUID="b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.034655 4721 generic.go:334] "Generic (PLEG): container finished" podID="7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" containerID="c4262d6d2388653a81a9cb645a74803128568514778cf935a45ab36a4268cbc6" exitCode=0 Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.034728 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftf6s" event={"ID":"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b","Type":"ContainerDied","Data":"c4262d6d2388653a81a9cb645a74803128568514778cf935a45ab36a4268cbc6"} Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.037004 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" event={"ID":"e5f7f80a-15ef-47b9-9e1e-325066df7897","Type":"ContainerStarted","Data":"0bd3bc5f864672ee0d853f714bcf1118de69c8c71858f5b65ce239a92ed34811"} Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.037026 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" event={"ID":"e5f7f80a-15ef-47b9-9e1e-325066df7897","Type":"ContainerStarted","Data":"05e1b0050534ad29187ccb842c7d704d41289dc2c02dfe3c8fae4b1bff20a647"} Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.037594 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.037886 4721 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.044103 4721 generic.go:334] "Generic (PLEG): container finished" podID="c32d34a1-8dd8-435d-9491-748392c25b97" containerID="70aefbce2f033e0a1ab3507eb5f25cc0466994c0e699fb1e5db5f4596c72d39d" exitCode=0 Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.044181 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2tcj" event={"ID":"c32d34a1-8dd8-435d-9491-748392c25b97","Type":"ContainerDied","Data":"70aefbce2f033e0a1ab3507eb5f25cc0466994c0e699fb1e5db5f4596c72d39d"} Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.047964 4721 generic.go:334] "Generic (PLEG): container finished" podID="3275da10-006d-43c9-bdd6-46282b8ac9d1" containerID="e6fbf9234354b68f2d074039ff57046d5bb0dc1205805a01d7f0f4b5a8698da0" exitCode=0 Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.048034 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95btx" event={"ID":"3275da10-006d-43c9-bdd6-46282b8ac9d1","Type":"ContainerDied","Data":"e6fbf9234354b68f2d074039ff57046d5bb0dc1205805a01d7f0f4b5a8698da0"} Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.048058 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95btx" event={"ID":"3275da10-006d-43c9-bdd6-46282b8ac9d1","Type":"ContainerStarted","Data":"d1d7313e7e9488250223e5d45503d08f43af174101c199e67ce71331531c040b"} Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.051424 4721 generic.go:334] "Generic (PLEG): container finished" podID="5b5702e4-96dd-479b-871a-d69bfdba91e1" 
containerID="7fd0cce05f07f725f6f879eba319ea2fa9bcc093917d1f8c17be638fe958ae8a" exitCode=0 Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.051667 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpqtk" event={"ID":"5b5702e4-96dd-479b-871a-d69bfdba91e1","Type":"ContainerDied","Data":"7fd0cce05f07f725f6f879eba319ea2fa9bcc093917d1f8c17be638fe958ae8a"} Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.051737 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpqtk" event={"ID":"5b5702e4-96dd-479b-871a-d69bfdba91e1","Type":"ContainerStarted","Data":"cee4e973cd74c01297b1880257bab371fd0bb8e104dad9318438c355e90ddc2e"} Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.142787 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" podStartSLOduration=135.142746737 podStartE2EDuration="2m15.142746737s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:40.137147458 +0000 UTC m=+160.439661847" watchObservedRunningTime="2026-02-02 13:03:40.142746737 +0000 UTC m=+160.445261126" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.175383 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.175526 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.182006 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.320650 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.359700 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-75gx6"] Feb 02 13:03:40 crc kubenswrapper[4721]: E0202 13:03:40.359964 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873a8c0c-9da4-4619-9ebf-7a327eb22b7e" containerName="collect-profiles" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.359979 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="873a8c0c-9da4-4619-9ebf-7a327eb22b7e" containerName="collect-profiles" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.360416 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="873a8c0c-9da4-4619-9ebf-7a327eb22b7e" containerName="collect-profiles" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.361474 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.369312 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.376451 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-75gx6"] Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.394827 4721 patch_prober.go:28] interesting pod/router-default-5444994796-7vhgv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 13:03:40 crc kubenswrapper[4721]: [-]has-synced failed: reason withheld Feb 02 13:03:40 crc kubenswrapper[4721]: [+]process-running ok Feb 02 13:03:40 crc kubenswrapper[4721]: healthz check failed Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.394933 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7vhgv" podUID="b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.409519 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-config-volume\") pod \"873a8c0c-9da4-4619-9ebf-7a327eb22b7e\" (UID: \"873a8c0c-9da4-4619-9ebf-7a327eb22b7e\") " Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.409628 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnzzr\" (UniqueName: \"kubernetes.io/projected/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-kube-api-access-lnzzr\") pod \"873a8c0c-9da4-4619-9ebf-7a327eb22b7e\" (UID: \"873a8c0c-9da4-4619-9ebf-7a327eb22b7e\") " Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.409669 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-secret-volume\") pod \"873a8c0c-9da4-4619-9ebf-7a327eb22b7e\" (UID: \"873a8c0c-9da4-4619-9ebf-7a327eb22b7e\") " Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.409833 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsbmd\" (UniqueName: \"kubernetes.io/projected/b11b9dcc-682e-48c6-9948-78aafcaf9e36-kube-api-access-gsbmd\") pod \"redhat-marketplace-75gx6\" (UID: \"b11b9dcc-682e-48c6-9948-78aafcaf9e36\") " pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.409898 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11b9dcc-682e-48c6-9948-78aafcaf9e36-utilities\") pod \"redhat-marketplace-75gx6\" (UID: \"b11b9dcc-682e-48c6-9948-78aafcaf9e36\") " pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.409917 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11b9dcc-682e-48c6-9948-78aafcaf9e36-catalog-content\") pod \"redhat-marketplace-75gx6\" (UID: \"b11b9dcc-682e-48c6-9948-78aafcaf9e36\") " 
pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.410784 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-config-volume" (OuterVolumeSpecName: "config-volume") pod "873a8c0c-9da4-4619-9ebf-7a327eb22b7e" (UID: "873a8c0c-9da4-4619-9ebf-7a327eb22b7e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.416463 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-kube-api-access-lnzzr" (OuterVolumeSpecName: "kube-api-access-lnzzr") pod "873a8c0c-9da4-4619-9ebf-7a327eb22b7e" (UID: "873a8c0c-9da4-4619-9ebf-7a327eb22b7e"). InnerVolumeSpecName "kube-api-access-lnzzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.423340 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "873a8c0c-9da4-4619-9ebf-7a327eb22b7e" (UID: "873a8c0c-9da4-4619-9ebf-7a327eb22b7e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.427471 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.511402 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsbmd\" (UniqueName: \"kubernetes.io/projected/b11b9dcc-682e-48c6-9948-78aafcaf9e36-kube-api-access-gsbmd\") pod \"redhat-marketplace-75gx6\" (UID: \"b11b9dcc-682e-48c6-9948-78aafcaf9e36\") " pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.511495 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11b9dcc-682e-48c6-9948-78aafcaf9e36-utilities\") pod \"redhat-marketplace-75gx6\" (UID: \"b11b9dcc-682e-48c6-9948-78aafcaf9e36\") " pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.511517 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11b9dcc-682e-48c6-9948-78aafcaf9e36-catalog-content\") pod \"redhat-marketplace-75gx6\" (UID: \"b11b9dcc-682e-48c6-9948-78aafcaf9e36\") " pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.511584 4721 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.511598 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnzzr\" (UniqueName: \"kubernetes.io/projected/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-kube-api-access-lnzzr\") on node \"crc\" DevicePath \"\"" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.511607 4721 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/873a8c0c-9da4-4619-9ebf-7a327eb22b7e-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.512039 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11b9dcc-682e-48c6-9948-78aafcaf9e36-catalog-content\") pod \"redhat-marketplace-75gx6\" (UID: \"b11b9dcc-682e-48c6-9948-78aafcaf9e36\") " pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.512188 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11b9dcc-682e-48c6-9948-78aafcaf9e36-utilities\") pod \"redhat-marketplace-75gx6\" (UID: \"b11b9dcc-682e-48c6-9948-78aafcaf9e36\") " pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.534930 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsbmd\" (UniqueName: \"kubernetes.io/projected/b11b9dcc-682e-48c6-9948-78aafcaf9e36-kube-api-access-gsbmd\") pod \"redhat-marketplace-75gx6\" (UID: \"b11b9dcc-682e-48c6-9948-78aafcaf9e36\") " pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.682248 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.774292 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7jcv9"] Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.778427 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.783223 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jcv9"] Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.815136 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f9579e-e58d-40b4-82c4-83111bfa9735-catalog-content\") pod \"redhat-marketplace-7jcv9\" (UID: \"d0f9579e-e58d-40b4-82c4-83111bfa9735\") " pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.815199 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f9579e-e58d-40b4-82c4-83111bfa9735-utilities\") pod \"redhat-marketplace-7jcv9\" (UID: \"d0f9579e-e58d-40b4-82c4-83111bfa9735\") " pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.815231 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24wc6\" (UniqueName: \"kubernetes.io/projected/d0f9579e-e58d-40b4-82c4-83111bfa9735-kube-api-access-24wc6\") pod \"redhat-marketplace-7jcv9\" (UID: \"d0f9579e-e58d-40b4-82c4-83111bfa9735\") " pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.870030 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.870966 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.875835 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.876107 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.885776 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-75gx6"] Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.895146 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.917408 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f9579e-e58d-40b4-82c4-83111bfa9735-utilities\") pod \"redhat-marketplace-7jcv9\" (UID: \"d0f9579e-e58d-40b4-82c4-83111bfa9735\") " pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.917451 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24wc6\" (UniqueName: \"kubernetes.io/projected/d0f9579e-e58d-40b4-82c4-83111bfa9735-kube-api-access-24wc6\") pod \"redhat-marketplace-7jcv9\" (UID: \"d0f9579e-e58d-40b4-82c4-83111bfa9735\") " pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.917525 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/598f8872-99ec-4855-9124-07a34b4ceaf9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"598f8872-99ec-4855-9124-07a34b4ceaf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.917554 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f9579e-e58d-40b4-82c4-83111bfa9735-catalog-content\") pod \"redhat-marketplace-7jcv9\" (UID: \"d0f9579e-e58d-40b4-82c4-83111bfa9735\") " pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.917584 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/598f8872-99ec-4855-9124-07a34b4ceaf9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"598f8872-99ec-4855-9124-07a34b4ceaf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.917999 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f9579e-e58d-40b4-82c4-83111bfa9735-catalog-content\") pod \"redhat-marketplace-7jcv9\" (UID: \"d0f9579e-e58d-40b4-82c4-83111bfa9735\") " pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.918011 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f9579e-e58d-40b4-82c4-83111bfa9735-utilities\") pod \"redhat-marketplace-7jcv9\" (UID: \"d0f9579e-e58d-40b4-82c4-83111bfa9735\") " 
pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:03:40 crc kubenswrapper[4721]: I0202 13:03:40.938277 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24wc6\" (UniqueName: \"kubernetes.io/projected/d0f9579e-e58d-40b4-82c4-83111bfa9735-kube-api-access-24wc6\") pod \"redhat-marketplace-7jcv9\" (UID: \"d0f9579e-e58d-40b4-82c4-83111bfa9735\") " pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.005844 4721 patch_prober.go:28] interesting pod/downloads-7954f5f757-zt9ng container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.005897 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zt9ng" podUID="af02a63f-5e62-47ff-baf5-1dc1e95dc1ad" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.005860 4721 patch_prober.go:28] interesting pod/downloads-7954f5f757-zt9ng container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.005952 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zt9ng" podUID="af02a63f-5e62-47ff-baf5-1dc1e95dc1ad" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.007375 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.008619 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.012196 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.012405 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.015103 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.018318 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/598f8872-99ec-4855-9124-07a34b4ceaf9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"598f8872-99ec-4855-9124-07a34b4ceaf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.018385 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8d8781e-3259-4d55-b0d2-968979b5cd99-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b8d8781e-3259-4d55-b0d2-968979b5cd99\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.018427 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8d8781e-3259-4d55-b0d2-968979b5cd99-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b8d8781e-3259-4d55-b0d2-968979b5cd99\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.018468 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/598f8872-99ec-4855-9124-07a34b4ceaf9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"598f8872-99ec-4855-9124-07a34b4ceaf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.018490 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/598f8872-99ec-4855-9124-07a34b4ceaf9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"598f8872-99ec-4855-9124-07a34b4ceaf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.040364 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/598f8872-99ec-4855-9124-07a34b4ceaf9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"598f8872-99ec-4855-9124-07a34b4ceaf9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.065224 4721 generic.go:334] "Generic (PLEG): container finished" podID="b11b9dcc-682e-48c6-9948-78aafcaf9e36" containerID="d98e8c3180feeb272dbc337ede325b3ee8bdf7c11b2445546d5a7351f1d071c3" exitCode=0 Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.065314 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75gx6" 
event={"ID":"b11b9dcc-682e-48c6-9948-78aafcaf9e36","Type":"ContainerDied","Data":"d98e8c3180feeb272dbc337ede325b3ee8bdf7c11b2445546d5a7351f1d071c3"} Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.065342 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75gx6" event={"ID":"b11b9dcc-682e-48c6-9948-78aafcaf9e36","Type":"ContainerStarted","Data":"5628d04181a13e0213caa7a951b015bba8003374b2bb6f608199a4eba95c3b17"} Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.067129 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" event={"ID":"873a8c0c-9da4-4619-9ebf-7a327eb22b7e","Type":"ContainerDied","Data":"a3ba63e33fb814de0a45f3a1bb2d277752ed10e110e4a55af2fb1ec65495a8cc"} Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.067160 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3ba63e33fb814de0a45f3a1bb2d277752ed10e110e4a55af2fb1ec65495a8cc" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.067236 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.072449 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-kcw66" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.112622 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.113468 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.114334 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.115274 4721 patch_prober.go:28] interesting pod/console-f9d7485db-2dsnx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.115313 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-2dsnx" podUID="ae3f417e-2bae-44dd-973f-5314b6f64972" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.121477 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8d8781e-3259-4d55-b0d2-968979b5cd99-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b8d8781e-3259-4d55-b0d2-968979b5cd99\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.121618 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8d8781e-3259-4d55-b0d2-968979b5cd99-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b8d8781e-3259-4d55-b0d2-968979b5cd99\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.121703 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8d8781e-3259-4d55-b0d2-968979b5cd99-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b8d8781e-3259-4d55-b0d2-968979b5cd99\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.179382 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8d8781e-3259-4d55-b0d2-968979b5cd99-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b8d8781e-3259-4d55-b0d2-968979b5cd99\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.206587 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.323490 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.378746 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pm5t7"] Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.395518 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.396286 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.399912 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.449054 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b97707af-edd5-4907-9459-615b32a005e6-utilities\") pod \"redhat-operators-pm5t7\" (UID: \"b97707af-edd5-4907-9459-615b32a005e6\") " pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.449323 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b97707af-edd5-4907-9459-615b32a005e6-catalog-content\") pod \"redhat-operators-pm5t7\" (UID: \"b97707af-edd5-4907-9459-615b32a005e6\") " pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.449353 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk875\" (UniqueName: \"kubernetes.io/projected/b97707af-edd5-4907-9459-615b32a005e6-kube-api-access-mk875\") pod \"redhat-operators-pm5t7\" (UID: \"b97707af-edd5-4907-9459-615b32a005e6\") " pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.451488 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pm5t7"] Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.456357 4721 patch_prober.go:28] interesting pod/router-default-5444994796-7vhgv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 13:03:41 crc kubenswrapper[4721]: [-]has-synced failed: reason withheld Feb 02 13:03:41 crc kubenswrapper[4721]: [+]process-running ok Feb 02 13:03:41 crc kubenswrapper[4721]: healthz check failed Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.456428 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7vhgv" podUID="b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.555743 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b97707af-edd5-4907-9459-615b32a005e6-catalog-content\") pod \"redhat-operators-pm5t7\" (UID: \"b97707af-edd5-4907-9459-615b32a005e6\") " pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.555788 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk875\" (UniqueName: \"kubernetes.io/projected/b97707af-edd5-4907-9459-615b32a005e6-kube-api-access-mk875\") pod \"redhat-operators-pm5t7\" (UID: \"b97707af-edd5-4907-9459-615b32a005e6\") " pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.555858 4721 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b97707af-edd5-4907-9459-615b32a005e6-utilities\") pod \"redhat-operators-pm5t7\" (UID: \"b97707af-edd5-4907-9459-615b32a005e6\") " pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.556385 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b97707af-edd5-4907-9459-615b32a005e6-utilities\") pod \"redhat-operators-pm5t7\" (UID: \"b97707af-edd5-4907-9459-615b32a005e6\") " pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.556605 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b97707af-edd5-4907-9459-615b32a005e6-catalog-content\") pod \"redhat-operators-pm5t7\" (UID: \"b97707af-edd5-4907-9459-615b32a005e6\") " pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.593349 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk875\" (UniqueName: \"kubernetes.io/projected/b97707af-edd5-4907-9459-615b32a005e6-kube-api-access-mk875\") pod \"redhat-operators-pm5t7\" (UID: \"b97707af-edd5-4907-9459-615b32a005e6\") " pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.641729 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jcv9"] Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.742603 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.752880 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.789138 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xqxjr"] Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.790110 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.811567 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.834272 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xqxjr"] Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.840128 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 02 13:03:41 crc kubenswrapper[4721]: W0202 13:03:41.892284 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb8d8781e_3259_4d55_b0d2_968979b5cd99.slice/crio-1082372af7a0adcfdeb24cb7893016cd78e285acfcc8814c2bd902fe4631a01a WatchSource:0}: Error finding container 1082372af7a0adcfdeb24cb7893016cd78e285acfcc8814c2bd902fe4631a01a: Status 404 returned error can't find the container with id 1082372af7a0adcfdeb24cb7893016cd78e285acfcc8814c2bd902fe4631a01a Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.962067 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fca822b5-78c8-47d9-9cc5-266118a2b5aa-catalog-content\") pod \"redhat-operators-xqxjr\" (UID: \"fca822b5-78c8-47d9-9cc5-266118a2b5aa\") " pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.962193 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fca822b5-78c8-47d9-9cc5-266118a2b5aa-utilities\") pod \"redhat-operators-xqxjr\" (UID: \"fca822b5-78c8-47d9-9cc5-266118a2b5aa\") " pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:03:41 crc kubenswrapper[4721]: I0202 13:03:41.962236 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8zdz\" (UniqueName: \"kubernetes.io/projected/fca822b5-78c8-47d9-9cc5-266118a2b5aa-kube-api-access-t8zdz\") pod \"redhat-operators-xqxjr\" (UID: \"fca822b5-78c8-47d9-9cc5-266118a2b5aa\") " pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:03:42 crc kubenswrapper[4721]: I0202 13:03:42.065411 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fca822b5-78c8-47d9-9cc5-266118a2b5aa-catalog-content\") pod \"redhat-operators-xqxjr\" (UID: \"fca822b5-78c8-47d9-9cc5-266118a2b5aa\") " pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:03:42 crc kubenswrapper[4721]: I0202 13:03:42.065822 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fca822b5-78c8-47d9-9cc5-266118a2b5aa-utilities\") pod \"redhat-operators-xqxjr\" (UID: \"fca822b5-78c8-47d9-9cc5-266118a2b5aa\") " pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:03:42 crc kubenswrapper[4721]: I0202 13:03:42.065854 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8zdz\" (UniqueName: \"kubernetes.io/projected/fca822b5-78c8-47d9-9cc5-266118a2b5aa-kube-api-access-t8zdz\") pod \"redhat-operators-xqxjr\" (UID: \"fca822b5-78c8-47d9-9cc5-266118a2b5aa\") " pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:03:42 crc kubenswrapper[4721]: I0202 13:03:42.066772 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fca822b5-78c8-47d9-9cc5-266118a2b5aa-catalog-content\") pod \"redhat-operators-xqxjr\" (UID: \"fca822b5-78c8-47d9-9cc5-266118a2b5aa\") " pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:03:42 crc kubenswrapper[4721]: I0202 13:03:42.067182 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fca822b5-78c8-47d9-9cc5-266118a2b5aa-utilities\") pod \"redhat-operators-xqxjr\" (UID: \"fca822b5-78c8-47d9-9cc5-266118a2b5aa\") " pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:03:42 crc kubenswrapper[4721]: I0202 13:03:42.099454 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jcv9" event={"ID":"d0f9579e-e58d-40b4-82c4-83111bfa9735","Type":"ContainerStarted","Data":"3ccd0f09abd6b2021524950faafaef3b13ab04a32e2570fdc710dd7ecb17b03e"} Feb 02 13:03:42 crc kubenswrapper[4721]: I0202 13:03:42.099512 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jcv9" event={"ID":"d0f9579e-e58d-40b4-82c4-83111bfa9735","Type":"ContainerStarted","Data":"00041994ab43d697d41af55566c6a0ac8e00b0660330d7b33111225ed94d785c"} Feb 02 13:03:42 crc kubenswrapper[4721]: I0202 13:03:42.100650 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8zdz\" (UniqueName: \"kubernetes.io/projected/fca822b5-78c8-47d9-9cc5-266118a2b5aa-kube-api-access-t8zdz\") pod \"redhat-operators-xqxjr\" (UID: \"fca822b5-78c8-47d9-9cc5-266118a2b5aa\") " pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:03:42 crc kubenswrapper[4721]: I0202 13:03:42.115413 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b8d8781e-3259-4d55-b0d2-968979b5cd99","Type":"ContainerStarted","Data":"1082372af7a0adcfdeb24cb7893016cd78e285acfcc8814c2bd902fe4631a01a"} Feb 02 13:03:42 crc kubenswrapper[4721]: I0202 13:03:42.117839 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"598f8872-99ec-4855-9124-07a34b4ceaf9","Type":"ContainerStarted","Data":"e681de4a8185193feb8aeae9c251e9c185dfaa6492e65f34c92fdd330062b5b0"} Feb 02 13:03:42 crc kubenswrapper[4721]: I0202 13:03:42.211840 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:03:42 crc kubenswrapper[4721]: I0202 13:03:42.283363 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pm5t7"] Feb 02 13:03:42 crc kubenswrapper[4721]: W0202 13:03:42.373951 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb97707af_edd5_4907_9459_615b32a005e6.slice/crio-731af821f39e85a65313814ab808bf2c6795132d15116e8c0a34a91225b2d5b6 WatchSource:0}: Error finding container 731af821f39e85a65313814ab808bf2c6795132d15116e8c0a34a91225b2d5b6: Status 404 returned error can't find the container with id 731af821f39e85a65313814ab808bf2c6795132d15116e8c0a34a91225b2d5b6 Feb 02 13:03:42 crc kubenswrapper[4721]: I0202 13:03:42.396966 4721 patch_prober.go:28] interesting pod/router-default-5444994796-7vhgv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 13:03:42 crc kubenswrapper[4721]: [-]has-synced failed: reason withheld Feb 02 13:03:42 crc kubenswrapper[4721]: [+]process-running ok Feb 02 13:03:42 crc kubenswrapper[4721]: healthz check failed Feb 02 13:03:42 crc kubenswrapper[4721]: I0202 13:03:42.397044 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7vhgv" podUID="b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 13:03:42 crc kubenswrapper[4721]: I0202 13:03:42.761842 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xqxjr"] Feb 02 13:03:43 crc kubenswrapper[4721]: I0202 13:03:43.127790 4721 generic.go:334] "Generic (PLEG): container finished" podID="b97707af-edd5-4907-9459-615b32a005e6" containerID="41599e7535f02f311fe8e5965707307ae8f5502aec8ceadc6ba6ac29d4504579" exitCode=0 Feb 02 13:03:43 crc kubenswrapper[4721]: I0202 13:03:43.128083 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm5t7" event={"ID":"b97707af-edd5-4907-9459-615b32a005e6","Type":"ContainerDied","Data":"41599e7535f02f311fe8e5965707307ae8f5502aec8ceadc6ba6ac29d4504579"} Feb 02 13:03:43 crc kubenswrapper[4721]: I0202 13:03:43.128141 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm5t7" event={"ID":"b97707af-edd5-4907-9459-615b32a005e6","Type":"ContainerStarted","Data":"731af821f39e85a65313814ab808bf2c6795132d15116e8c0a34a91225b2d5b6"} Feb 02 13:03:43 crc kubenswrapper[4721]: I0202 13:03:43.141820 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b8d8781e-3259-4d55-b0d2-968979b5cd99","Type":"ContainerStarted","Data":"ad974937e7efafb3ad0f3ccfafa676f54039c93d4ff006428e8fe6f5b648894d"} Feb 02 13:03:43 crc kubenswrapper[4721]: I0202 13:03:43.147188 4721 generic.go:334] "Generic (PLEG): container finished" podID="fca822b5-78c8-47d9-9cc5-266118a2b5aa" containerID="1307da490b649bddfe1be4c1fdffa3d95bafee24aa3239fe6da07477129dd995" exitCode=0 Feb 02 13:03:43 crc kubenswrapper[4721]: I0202 13:03:43.147275 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqxjr" 
event={"ID":"fca822b5-78c8-47d9-9cc5-266118a2b5aa","Type":"ContainerDied","Data":"1307da490b649bddfe1be4c1fdffa3d95bafee24aa3239fe6da07477129dd995"} Feb 02 13:03:43 crc kubenswrapper[4721]: I0202 13:03:43.147307 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqxjr" event={"ID":"fca822b5-78c8-47d9-9cc5-266118a2b5aa","Type":"ContainerStarted","Data":"47060e9c65a5aeb6ad4fedeb9a16d1cc215190986f8d21247996042fbd85459e"} Feb 02 13:03:43 crc kubenswrapper[4721]: I0202 13:03:43.155185 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"598f8872-99ec-4855-9124-07a34b4ceaf9","Type":"ContainerStarted","Data":"dcf732d3a734d4bb94ed49bdb0e62cb781c98472db6dbf2ecb3f859e26f8167a"} Feb 02 13:03:43 crc kubenswrapper[4721]: I0202 13:03:43.161868 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.161847426 podStartE2EDuration="3.161847426s" podCreationTimestamp="2026-02-02 13:03:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:43.157921725 +0000 UTC m=+163.460436134" watchObservedRunningTime="2026-02-02 13:03:43.161847426 +0000 UTC m=+163.464361835" Feb 02 13:03:43 crc kubenswrapper[4721]: I0202 13:03:43.174514 4721 generic.go:334] "Generic (PLEG): container finished" podID="d0f9579e-e58d-40b4-82c4-83111bfa9735" containerID="3ccd0f09abd6b2021524950faafaef3b13ab04a32e2570fdc710dd7ecb17b03e" exitCode=0 Feb 02 13:03:43 crc kubenswrapper[4721]: I0202 13:03:43.174588 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jcv9" event={"ID":"d0f9579e-e58d-40b4-82c4-83111bfa9735","Type":"ContainerDied","Data":"3ccd0f09abd6b2021524950faafaef3b13ab04a32e2570fdc710dd7ecb17b03e"} Feb 02 13:03:43 crc kubenswrapper[4721]: I0202 13:03:43.177331 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.177313534 podStartE2EDuration="3.177313534s" podCreationTimestamp="2026-02-02 13:03:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:03:43.175122633 +0000 UTC m=+163.477637032" watchObservedRunningTime="2026-02-02 13:03:43.177313534 +0000 UTC m=+163.479827933" Feb 02 13:03:43 crc kubenswrapper[4721]: I0202 13:03:43.398112 4721 patch_prober.go:28] interesting pod/router-default-5444994796-7vhgv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 13:03:43 crc kubenswrapper[4721]: [-]has-synced failed: reason withheld Feb 02 13:03:43 crc kubenswrapper[4721]: [+]process-running ok Feb 02 13:03:43 crc kubenswrapper[4721]: healthz check failed Feb 02 13:03:43 crc kubenswrapper[4721]: I0202 13:03:43.398611 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7vhgv" podUID="b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 13:03:44 crc kubenswrapper[4721]: I0202 13:03:44.189029 4721 generic.go:334] "Generic (PLEG): container finished" podID="b8d8781e-3259-4d55-b0d2-968979b5cd99" 
containerID="ad974937e7efafb3ad0f3ccfafa676f54039c93d4ff006428e8fe6f5b648894d" exitCode=0 Feb 02 13:03:44 crc kubenswrapper[4721]: I0202 13:03:44.189368 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b8d8781e-3259-4d55-b0d2-968979b5cd99","Type":"ContainerDied","Data":"ad974937e7efafb3ad0f3ccfafa676f54039c93d4ff006428e8fe6f5b648894d"} Feb 02 13:03:44 crc kubenswrapper[4721]: I0202 13:03:44.195400 4721 generic.go:334] "Generic (PLEG): container finished" podID="598f8872-99ec-4855-9124-07a34b4ceaf9" containerID="dcf732d3a734d4bb94ed49bdb0e62cb781c98472db6dbf2ecb3f859e26f8167a" exitCode=0 Feb 02 13:03:44 crc kubenswrapper[4721]: I0202 13:03:44.195545 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"598f8872-99ec-4855-9124-07a34b4ceaf9","Type":"ContainerDied","Data":"dcf732d3a734d4bb94ed49bdb0e62cb781c98472db6dbf2ecb3f859e26f8167a"} Feb 02 13:03:44 crc kubenswrapper[4721]: I0202 13:03:44.394661 4721 patch_prober.go:28] interesting pod/router-default-5444994796-7vhgv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 13:03:44 crc kubenswrapper[4721]: [-]has-synced failed: reason withheld Feb 02 13:03:44 crc kubenswrapper[4721]: [+]process-running ok Feb 02 13:03:44 crc kubenswrapper[4721]: healthz check failed Feb 02 13:03:44 crc kubenswrapper[4721]: I0202 13:03:44.394767 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7vhgv" podUID="b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 13:03:44 crc kubenswrapper[4721]: I0202 13:03:44.765053 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:03:44 crc kubenswrapper[4721]: I0202 13:03:44.765137 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:03:45 crc kubenswrapper[4721]: I0202 13:03:45.395271 4721 patch_prober.go:28] interesting pod/router-default-5444994796-7vhgv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 13:03:45 crc kubenswrapper[4721]: [-]has-synced failed: reason withheld Feb 02 13:03:45 crc kubenswrapper[4721]: [+]process-running ok Feb 02 13:03:45 crc kubenswrapper[4721]: healthz check failed Feb 02 13:03:45 crc kubenswrapper[4721]: I0202 13:03:45.395342 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7vhgv" podUID="b2cdcb8a-fc4a-4f72-8a37-c7c0351ca2eb" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 13:03:45 crc kubenswrapper[4721]: I0202 13:03:45.736012 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 13:03:45 crc kubenswrapper[4721]: I0202 13:03:45.813756 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 13:03:45 crc kubenswrapper[4721]: I0202 13:03:45.844964 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8d8781e-3259-4d55-b0d2-968979b5cd99-kubelet-dir\") pod \"b8d8781e-3259-4d55-b0d2-968979b5cd99\" (UID: \"b8d8781e-3259-4d55-b0d2-968979b5cd99\") " Feb 02 13:03:45 crc kubenswrapper[4721]: I0202 13:03:45.845220 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8d8781e-3259-4d55-b0d2-968979b5cd99-kube-api-access\") pod \"b8d8781e-3259-4d55-b0d2-968979b5cd99\" (UID: \"b8d8781e-3259-4d55-b0d2-968979b5cd99\") " Feb 02 13:03:45 crc kubenswrapper[4721]: I0202 13:03:45.845115 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8d8781e-3259-4d55-b0d2-968979b5cd99-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b8d8781e-3259-4d55-b0d2-968979b5cd99" (UID: "b8d8781e-3259-4d55-b0d2-968979b5cd99"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:03:45 crc kubenswrapper[4721]: I0202 13:03:45.845772 4721 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8d8781e-3259-4d55-b0d2-968979b5cd99-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 02 13:03:45 crc kubenswrapper[4721]: I0202 13:03:45.861297 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d8781e-3259-4d55-b0d2-968979b5cd99-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b8d8781e-3259-4d55-b0d2-968979b5cd99" (UID: "b8d8781e-3259-4d55-b0d2-968979b5cd99"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:03:45 crc kubenswrapper[4721]: I0202 13:03:45.947101 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/598f8872-99ec-4855-9124-07a34b4ceaf9-kubelet-dir\") pod \"598f8872-99ec-4855-9124-07a34b4ceaf9\" (UID: \"598f8872-99ec-4855-9124-07a34b4ceaf9\") " Feb 02 13:03:45 crc kubenswrapper[4721]: I0202 13:03:45.947184 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/598f8872-99ec-4855-9124-07a34b4ceaf9-kube-api-access\") pod \"598f8872-99ec-4855-9124-07a34b4ceaf9\" (UID: \"598f8872-99ec-4855-9124-07a34b4ceaf9\") " Feb 02 13:03:45 crc kubenswrapper[4721]: I0202 13:03:45.947493 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8d8781e-3259-4d55-b0d2-968979b5cd99-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 13:03:45 crc kubenswrapper[4721]: I0202 13:03:45.947882 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/598f8872-99ec-4855-9124-07a34b4ceaf9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "598f8872-99ec-4855-9124-07a34b4ceaf9" (UID: "598f8872-99ec-4855-9124-07a34b4ceaf9"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:03:45 crc kubenswrapper[4721]: I0202 13:03:45.954089 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598f8872-99ec-4855-9124-07a34b4ceaf9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "598f8872-99ec-4855-9124-07a34b4ceaf9" (UID: "598f8872-99ec-4855-9124-07a34b4ceaf9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:03:46 crc kubenswrapper[4721]: I0202 13:03:46.054158 4721 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/598f8872-99ec-4855-9124-07a34b4ceaf9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 02 13:03:46 crc kubenswrapper[4721]: I0202 13:03:46.054202 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/598f8872-99ec-4855-9124-07a34b4ceaf9-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 13:03:46 crc kubenswrapper[4721]: I0202 13:03:46.220782 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 13:03:46 crc kubenswrapper[4721]: I0202 13:03:46.220823 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b8d8781e-3259-4d55-b0d2-968979b5cd99","Type":"ContainerDied","Data":"1082372af7a0adcfdeb24cb7893016cd78e285acfcc8814c2bd902fe4631a01a"} Feb 02 13:03:46 crc kubenswrapper[4721]: I0202 13:03:46.220884 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1082372af7a0adcfdeb24cb7893016cd78e285acfcc8814c2bd902fe4631a01a" Feb 02 13:03:46 crc kubenswrapper[4721]: I0202 13:03:46.225735 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"598f8872-99ec-4855-9124-07a34b4ceaf9","Type":"ContainerDied","Data":"e681de4a8185193feb8aeae9c251e9c185dfaa6492e65f34c92fdd330062b5b0"} Feb 02 13:03:46 crc kubenswrapper[4721]: I0202 13:03:46.225772 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e681de4a8185193feb8aeae9c251e9c185dfaa6492e65f34c92fdd330062b5b0" Feb 02 13:03:46 crc kubenswrapper[4721]: I0202 13:03:46.225835 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 13:03:46 crc kubenswrapper[4721]: I0202 13:03:46.395317 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:46 crc kubenswrapper[4721]: I0202 13:03:46.404387 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-7vhgv" Feb 02 13:03:46 crc kubenswrapper[4721]: I0202 13:03:46.852289 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kh9ph" Feb 02 13:03:48 crc kubenswrapper[4721]: I0202 13:03:48.116201 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs\") pod \"network-metrics-daemon-xqz79\" (UID: \"bfab3ffb-8798-423d-9b55-83868b76a14e\") " pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:03:48 crc kubenswrapper[4721]: I0202 13:03:48.123615 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfab3ffb-8798-423d-9b55-83868b76a14e-metrics-certs\") pod \"network-metrics-daemon-xqz79\" (UID: \"bfab3ffb-8798-423d-9b55-83868b76a14e\") " pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:03:48 crc kubenswrapper[4721]: I0202 13:03:48.147089 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xqz79" Feb 02 13:03:51 crc kubenswrapper[4721]: I0202 13:03:51.019764 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-zt9ng" Feb 02 13:03:51 crc kubenswrapper[4721]: I0202 13:03:51.123348 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:51 crc kubenswrapper[4721]: I0202 13:03:51.127786 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-2dsnx" Feb 02 13:03:58 crc kubenswrapper[4721]: I0202 13:03:58.823309 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:04:07 crc kubenswrapper[4721]: I0202 13:04:07.444691 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 13:04:07 crc kubenswrapper[4721]: E0202 13:04:07.707421 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 02 13:04:07 crc kubenswrapper[4721]: E0202 13:04:07.707907 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-srrj7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ftf6s_openshift-marketplace(7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 02 13:04:07 crc kubenswrapper[4721]: E0202 13:04:07.709868 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ftf6s" podUID="7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" Feb 02 13:04:11 crc kubenswrapper[4721]: E0202 13:04:11.352045 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ftf6s" podUID="7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" Feb 02 13:04:11 crc kubenswrapper[4721]: I0202 13:04:11.722817 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2wzzt" Feb 02 13:04:13 crc kubenswrapper[4721]: E0202 13:04:13.540254 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 02 13:04:13 crc kubenswrapper[4721]: E0202 13:04:13.545062 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jdlcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hpqtk_openshift-marketplace(5b5702e4-96dd-479b-871a-d69bfdba91e1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 02 13:04:13 crc kubenswrapper[4721]: E0202 13:04:13.546305 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hpqtk" podUID="5b5702e4-96dd-479b-871a-d69bfdba91e1" Feb 02 13:04:13 crc kubenswrapper[4721]: E0202 13:04:13.590905 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 02 13:04:13 crc kubenswrapper[4721]: E0202 13:04:13.591583 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bgmqw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-s2tcj_openshift-marketplace(c32d34a1-8dd8-435d-9491-748392c25b97): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 02 13:04:13 crc kubenswrapper[4721]: E0202 13:04:13.592848 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-s2tcj" podUID="c32d34a1-8dd8-435d-9491-748392c25b97" Feb 02 13:04:13 crc kubenswrapper[4721]: I0202 13:04:13.974996 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xqz79"] Feb 02 13:04:14 crc kubenswrapper[4721]: W0202 13:04:14.009440 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfab3ffb_8798_423d_9b55_83868b76a14e.slice/crio-887920ec11b80f3e5c08efc430c8a9513fd6b9bc0af16af51b3800b7e577e87c WatchSource:0}: Error finding container 887920ec11b80f3e5c08efc430c8a9513fd6b9bc0af16af51b3800b7e577e87c: Status 404 returned error can't find the container with id 887920ec11b80f3e5c08efc430c8a9513fd6b9bc0af16af51b3800b7e577e87c Feb 02 13:04:14 crc kubenswrapper[4721]: I0202 13:04:14.158887 4721 generic.go:334] "Generic (PLEG): container finished" podID="3275da10-006d-43c9-bdd6-46282b8ac9d1" containerID="3a799bbd42f589f408396d566c14a41d1e41856170a302c97a49ff32d09288f3" exitCode=0 Feb 02 13:04:14 crc kubenswrapper[4721]: I0202 13:04:14.159237 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95btx" event={"ID":"3275da10-006d-43c9-bdd6-46282b8ac9d1","Type":"ContainerDied","Data":"3a799bbd42f589f408396d566c14a41d1e41856170a302c97a49ff32d09288f3"} Feb 02 13:04:14 crc kubenswrapper[4721]: I0202 13:04:14.161992 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xqz79" 
event={"ID":"bfab3ffb-8798-423d-9b55-83868b76a14e","Type":"ContainerStarted","Data":"887920ec11b80f3e5c08efc430c8a9513fd6b9bc0af16af51b3800b7e577e87c"} Feb 02 13:04:14 crc kubenswrapper[4721]: I0202 13:04:14.168240 4721 generic.go:334] "Generic (PLEG): container finished" podID="d0f9579e-e58d-40b4-82c4-83111bfa9735" containerID="90052ca542d1d90d3ece1db6282e32c41dc45e759cbb4a25db16d8178731746b" exitCode=0 Feb 02 13:04:14 crc kubenswrapper[4721]: I0202 13:04:14.168314 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jcv9" event={"ID":"d0f9579e-e58d-40b4-82c4-83111bfa9735","Type":"ContainerDied","Data":"90052ca542d1d90d3ece1db6282e32c41dc45e759cbb4a25db16d8178731746b"} Feb 02 13:04:14 crc kubenswrapper[4721]: I0202 13:04:14.172585 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm5t7" event={"ID":"b97707af-edd5-4907-9459-615b32a005e6","Type":"ContainerStarted","Data":"0d83a50e35994a069f84ac2fbbbdd424065961585a6b4d7fa1391779a81dfd2b"} Feb 02 13:04:14 crc kubenswrapper[4721]: I0202 13:04:14.175801 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqxjr" event={"ID":"fca822b5-78c8-47d9-9cc5-266118a2b5aa","Type":"ContainerStarted","Data":"6b1f19b1332f89efa38077451d6165f2371bd16012b96e8fb0038bfe0c3a6dbe"} Feb 02 13:04:14 crc kubenswrapper[4721]: I0202 13:04:14.182680 4721 generic.go:334] "Generic (PLEG): container finished" podID="b11b9dcc-682e-48c6-9948-78aafcaf9e36" containerID="e1bc2f4e0704a9ccfc7c0081b3e11299c3f9620d4305892b40a44031af50c66b" exitCode=0 Feb 02 13:04:14 crc kubenswrapper[4721]: I0202 13:04:14.183392 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75gx6" event={"ID":"b11b9dcc-682e-48c6-9948-78aafcaf9e36","Type":"ContainerDied","Data":"e1bc2f4e0704a9ccfc7c0081b3e11299c3f9620d4305892b40a44031af50c66b"} Feb 02 13:04:14 crc kubenswrapper[4721]: E0202 13:04:14.185215 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hpqtk" podUID="5b5702e4-96dd-479b-871a-d69bfdba91e1" Feb 02 13:04:14 crc kubenswrapper[4721]: E0202 13:04:14.186810 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-s2tcj" podUID="c32d34a1-8dd8-435d-9491-748392c25b97" Feb 02 13:04:14 crc kubenswrapper[4721]: I0202 13:04:14.764164 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:04:14 crc kubenswrapper[4721]: I0202 13:04:14.764242 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:04:15 crc kubenswrapper[4721]: I0202 13:04:15.191978 4721 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jcv9" event={"ID":"d0f9579e-e58d-40b4-82c4-83111bfa9735","Type":"ContainerStarted","Data":"c8f57b53de24ab26aa99f7a4280707e1dd37483fb97ed167765b2081fa94f563"} Feb 02 13:04:15 crc kubenswrapper[4721]: I0202 13:04:15.200367 4721 generic.go:334] "Generic (PLEG): container finished" podID="b97707af-edd5-4907-9459-615b32a005e6" containerID="0d83a50e35994a069f84ac2fbbbdd424065961585a6b4d7fa1391779a81dfd2b" exitCode=0 Feb 02 13:04:15 crc kubenswrapper[4721]: I0202 13:04:15.200488 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm5t7" event={"ID":"b97707af-edd5-4907-9459-615b32a005e6","Type":"ContainerDied","Data":"0d83a50e35994a069f84ac2fbbbdd424065961585a6b4d7fa1391779a81dfd2b"} Feb 02 13:04:15 crc kubenswrapper[4721]: I0202 13:04:15.202434 4721 generic.go:334] "Generic (PLEG): container finished" podID="fca822b5-78c8-47d9-9cc5-266118a2b5aa" containerID="6b1f19b1332f89efa38077451d6165f2371bd16012b96e8fb0038bfe0c3a6dbe" exitCode=0 Feb 02 13:04:15 crc kubenswrapper[4721]: I0202 13:04:15.202497 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqxjr" event={"ID":"fca822b5-78c8-47d9-9cc5-266118a2b5aa","Type":"ContainerDied","Data":"6b1f19b1332f89efa38077451d6165f2371bd16012b96e8fb0038bfe0c3a6dbe"} Feb 02 13:04:15 crc kubenswrapper[4721]: I0202 13:04:15.209433 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75gx6" event={"ID":"b11b9dcc-682e-48c6-9948-78aafcaf9e36","Type":"ContainerStarted","Data":"7fb035f08da78b4db85bc1ea0be2acf6c348cb105b55a68b2cac0ebf26b8bc1a"} Feb 02 13:04:15 crc kubenswrapper[4721]: I0202 13:04:15.212421 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7jcv9" podStartSLOduration=3.610797873 podStartE2EDuration="35.212393736s" podCreationTimestamp="2026-02-02 13:03:40 +0000 UTC" firstStartedPulling="2026-02-02 13:03:43.179492196 +0000 UTC m=+163.482006585" lastFinishedPulling="2026-02-02 13:04:14.781088059 +0000 UTC m=+195.083602448" observedRunningTime="2026-02-02 13:04:15.207408376 +0000 UTC m=+195.509922765" watchObservedRunningTime="2026-02-02 13:04:15.212393736 +0000 UTC m=+195.514908125" Feb 02 13:04:15 crc kubenswrapper[4721]: I0202 13:04:15.213923 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95btx" event={"ID":"3275da10-006d-43c9-bdd6-46282b8ac9d1","Type":"ContainerStarted","Data":"94276164198a8658973658b0a07a815f463d34dcaac89175ee34daf72aa614d8"} Feb 02 13:04:15 crc kubenswrapper[4721]: I0202 13:04:15.216555 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xqz79" event={"ID":"bfab3ffb-8798-423d-9b55-83868b76a14e","Type":"ContainerStarted","Data":"9b833fb79bf3786baf344597d3cdbbaeab1780723f8978fbcfe0918a478d52e6"} Feb 02 13:04:15 crc kubenswrapper[4721]: I0202 13:04:15.216593 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xqz79" event={"ID":"bfab3ffb-8798-423d-9b55-83868b76a14e","Type":"ContainerStarted","Data":"2d59ca22026c845d2ad7159f1f4ba021a35d1cc4512b4a8368734bde90d61845"} Feb 02 13:04:15 crc kubenswrapper[4721]: I0202 13:04:15.272954 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-95btx" podStartSLOduration=2.649222793 
podStartE2EDuration="37.272934401s" podCreationTimestamp="2026-02-02 13:03:38 +0000 UTC" firstStartedPulling="2026-02-02 13:03:40.050220456 +0000 UTC m=+160.352734845" lastFinishedPulling="2026-02-02 13:04:14.673932064 +0000 UTC m=+194.976446453" observedRunningTime="2026-02-02 13:04:15.262568978 +0000 UTC m=+195.565083367" watchObservedRunningTime="2026-02-02 13:04:15.272934401 +0000 UTC m=+195.575448790" Feb 02 13:04:15 crc kubenswrapper[4721]: I0202 13:04:15.286790 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xqz79" podStartSLOduration=170.286771303 podStartE2EDuration="2m50.286771303s" podCreationTimestamp="2026-02-02 13:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:04:15.282985427 +0000 UTC m=+195.585499816" watchObservedRunningTime="2026-02-02 13:04:15.286771303 +0000 UTC m=+195.589285692" Feb 02 13:04:15 crc kubenswrapper[4721]: I0202 13:04:15.308681 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-75gx6" podStartSLOduration=1.661019316 podStartE2EDuration="35.308663574s" podCreationTimestamp="2026-02-02 13:03:40 +0000 UTC" firstStartedPulling="2026-02-02 13:03:41.067791781 +0000 UTC m=+161.370306170" lastFinishedPulling="2026-02-02 13:04:14.715436039 +0000 UTC m=+195.017950428" observedRunningTime="2026-02-02 13:04:15.306481721 +0000 UTC m=+195.608996120" watchObservedRunningTime="2026-02-02 13:04:15.308663574 +0000 UTC m=+195.611177963" Feb 02 13:04:16 crc kubenswrapper[4721]: I0202 13:04:16.225908 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm5t7" event={"ID":"b97707af-edd5-4907-9459-615b32a005e6","Type":"ContainerStarted","Data":"33ab21456de0805136a88c1d6ecd74979aabd0d3bfbd8cc958f6e35c18ed050d"} Feb 02 13:04:16 crc kubenswrapper[4721]: I0202 13:04:16.228086 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqxjr" event={"ID":"fca822b5-78c8-47d9-9cc5-266118a2b5aa","Type":"ContainerStarted","Data":"d88ecd85c5a2024374c4bb375130ae1821f595497ae59083522e66adc526cf21"} Feb 02 13:04:16 crc kubenswrapper[4721]: I0202 13:04:16.276107 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pm5t7" podStartSLOduration=2.712960337 podStartE2EDuration="35.276045926s" podCreationTimestamp="2026-02-02 13:03:41 +0000 UTC" firstStartedPulling="2026-02-02 13:03:43.130838248 +0000 UTC m=+163.433352637" lastFinishedPulling="2026-02-02 13:04:15.693923837 +0000 UTC m=+195.996438226" observedRunningTime="2026-02-02 13:04:16.250381479 +0000 UTC m=+196.552895888" watchObservedRunningTime="2026-02-02 13:04:16.276045926 +0000 UTC m=+196.578560335" Feb 02 13:04:18 crc kubenswrapper[4721]: I0202 13:04:18.951997 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-95btx" Feb 02 13:04:18 crc kubenswrapper[4721]: I0202 13:04:18.952355 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-95btx" Feb 02 13:04:19 crc kubenswrapper[4721]: I0202 13:04:19.194170 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-95btx" Feb 02 13:04:19 crc kubenswrapper[4721]: I0202 13:04:19.212788 4721 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xqxjr" podStartSLOduration=6.809538319 podStartE2EDuration="38.212763223s" podCreationTimestamp="2026-02-02 13:03:41 +0000 UTC" firstStartedPulling="2026-02-02 13:03:44.198037058 +0000 UTC m=+164.500551437" lastFinishedPulling="2026-02-02 13:04:15.601261952 +0000 UTC m=+195.903776341" observedRunningTime="2026-02-02 13:04:16.276640903 +0000 UTC m=+196.579155302" watchObservedRunningTime="2026-02-02 13:04:19.212763223 +0000 UTC m=+199.515277622" Feb 02 13:04:19 crc kubenswrapper[4721]: I0202 13:04:19.308479 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-95btx" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.208817 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 02 13:04:20 crc kubenswrapper[4721]: E0202 13:04:20.209042 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598f8872-99ec-4855-9124-07a34b4ceaf9" containerName="pruner" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.209055 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="598f8872-99ec-4855-9124-07a34b4ceaf9" containerName="pruner" Feb 02 13:04:20 crc kubenswrapper[4721]: E0202 13:04:20.209085 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d8781e-3259-4d55-b0d2-968979b5cd99" containerName="pruner" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.209091 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d8781e-3259-4d55-b0d2-968979b5cd99" containerName="pruner" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.209201 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d8781e-3259-4d55-b0d2-968979b5cd99" containerName="pruner" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.209212 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="598f8872-99ec-4855-9124-07a34b4ceaf9" containerName="pruner" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.209560 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.213242 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.218061 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.226617 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.283166 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/429dff39-5ef0-4e43-99bc-30771e26645d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"429dff39-5ef0-4e43-99bc-30771e26645d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.283234 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/429dff39-5ef0-4e43-99bc-30771e26645d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"429dff39-5ef0-4e43-99bc-30771e26645d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.388738 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/429dff39-5ef0-4e43-99bc-30771e26645d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"429dff39-5ef0-4e43-99bc-30771e26645d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.388803 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/429dff39-5ef0-4e43-99bc-30771e26645d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"429dff39-5ef0-4e43-99bc-30771e26645d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.388890 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/429dff39-5ef0-4e43-99bc-30771e26645d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"429dff39-5ef0-4e43-99bc-30771e26645d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.400834 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-95btx"] Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.412484 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/429dff39-5ef0-4e43-99bc-30771e26645d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"429dff39-5ef0-4e43-99bc-30771e26645d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.534725 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.683271 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.683700 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:04:20 crc kubenswrapper[4721]: I0202 13:04:20.735929 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:04:21 crc kubenswrapper[4721]: I0202 13:04:21.011573 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 02 13:04:21 crc kubenswrapper[4721]: I0202 13:04:21.115381 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:04:21 crc kubenswrapper[4721]: I0202 13:04:21.115860 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:04:21 crc kubenswrapper[4721]: I0202 13:04:21.178953 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:04:21 crc kubenswrapper[4721]: I0202 13:04:21.273521 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"429dff39-5ef0-4e43-99bc-30771e26645d","Type":"ContainerStarted","Data":"b69839db7fc14233bb04f6125cec3d414151d8a54e4db751a1dca8cdc74f05a2"} Feb 02 13:04:21 crc kubenswrapper[4721]: I0202 13:04:21.273924 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-95btx" podUID="3275da10-006d-43c9-bdd6-46282b8ac9d1" containerName="registry-server" containerID="cri-o://94276164198a8658973658b0a07a815f463d34dcaac89175ee34daf72aa614d8" gracePeriod=2 Feb 02 13:04:21 crc kubenswrapper[4721]: I0202 13:04:21.320922 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:04:21 crc kubenswrapper[4721]: I0202 13:04:21.367632 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:04:21 crc kubenswrapper[4721]: I0202 13:04:21.791305 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:04:21 crc kubenswrapper[4721]: I0202 13:04:21.791394 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.174095 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95btx" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.212297 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.212353 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.250699 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.292449 4721 generic.go:334] "Generic (PLEG): container finished" podID="3275da10-006d-43c9-bdd6-46282b8ac9d1" containerID="94276164198a8658973658b0a07a815f463d34dcaac89175ee34daf72aa614d8" exitCode=0 Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.292514 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95btx" event={"ID":"3275da10-006d-43c9-bdd6-46282b8ac9d1","Type":"ContainerDied","Data":"94276164198a8658973658b0a07a815f463d34dcaac89175ee34daf72aa614d8"} Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.292564 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-95btx" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.292584 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95btx" event={"ID":"3275da10-006d-43c9-bdd6-46282b8ac9d1","Type":"ContainerDied","Data":"d1d7313e7e9488250223e5d45503d08f43af174101c199e67ce71331531c040b"} Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.292608 4721 scope.go:117] "RemoveContainer" containerID="94276164198a8658973658b0a07a815f463d34dcaac89175ee34daf72aa614d8" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.295320 4721 generic.go:334] "Generic (PLEG): container finished" podID="429dff39-5ef0-4e43-99bc-30771e26645d" containerID="f888fad76be81cd7d07bc8c827b1097ff3e61b82f112b31db96020604819b736" exitCode=0 Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.295396 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"429dff39-5ef0-4e43-99bc-30771e26645d","Type":"ContainerDied","Data":"f888fad76be81cd7d07bc8c827b1097ff3e61b82f112b31db96020604819b736"} Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.312846 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3275da10-006d-43c9-bdd6-46282b8ac9d1-catalog-content\") pod \"3275da10-006d-43c9-bdd6-46282b8ac9d1\" (UID: \"3275da10-006d-43c9-bdd6-46282b8ac9d1\") " Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.313096 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3275da10-006d-43c9-bdd6-46282b8ac9d1-utilities\") pod \"3275da10-006d-43c9-bdd6-46282b8ac9d1\" (UID: \"3275da10-006d-43c9-bdd6-46282b8ac9d1\") " Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.313283 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8cfp\" (UniqueName: \"kubernetes.io/projected/3275da10-006d-43c9-bdd6-46282b8ac9d1-kube-api-access-q8cfp\") pod \"3275da10-006d-43c9-bdd6-46282b8ac9d1\" (UID: 
\"3275da10-006d-43c9-bdd6-46282b8ac9d1\") " Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.314249 4721 scope.go:117] "RemoveContainer" containerID="3a799bbd42f589f408396d566c14a41d1e41856170a302c97a49ff32d09288f3" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.315215 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3275da10-006d-43c9-bdd6-46282b8ac9d1-utilities" (OuterVolumeSpecName: "utilities") pod "3275da10-006d-43c9-bdd6-46282b8ac9d1" (UID: "3275da10-006d-43c9-bdd6-46282b8ac9d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.320602 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3275da10-006d-43c9-bdd6-46282b8ac9d1-kube-api-access-q8cfp" (OuterVolumeSpecName: "kube-api-access-q8cfp") pod "3275da10-006d-43c9-bdd6-46282b8ac9d1" (UID: "3275da10-006d-43c9-bdd6-46282b8ac9d1"). InnerVolumeSpecName "kube-api-access-q8cfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.338183 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.353043 4721 scope.go:117] "RemoveContainer" containerID="e6fbf9234354b68f2d074039ff57046d5bb0dc1205805a01d7f0f4b5a8698da0" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.377224 4721 scope.go:117] "RemoveContainer" containerID="94276164198a8658973658b0a07a815f463d34dcaac89175ee34daf72aa614d8" Feb 02 13:04:22 crc kubenswrapper[4721]: E0202 13:04:22.377842 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94276164198a8658973658b0a07a815f463d34dcaac89175ee34daf72aa614d8\": container with ID starting with 94276164198a8658973658b0a07a815f463d34dcaac89175ee34daf72aa614d8 not found: ID does not exist" containerID="94276164198a8658973658b0a07a815f463d34dcaac89175ee34daf72aa614d8" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.377885 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94276164198a8658973658b0a07a815f463d34dcaac89175ee34daf72aa614d8"} err="failed to get container status \"94276164198a8658973658b0a07a815f463d34dcaac89175ee34daf72aa614d8\": rpc error: code = NotFound desc = could not find container \"94276164198a8658973658b0a07a815f463d34dcaac89175ee34daf72aa614d8\": container with ID starting with 94276164198a8658973658b0a07a815f463d34dcaac89175ee34daf72aa614d8 not found: ID does not exist" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.377936 4721 scope.go:117] "RemoveContainer" containerID="3a799bbd42f589f408396d566c14a41d1e41856170a302c97a49ff32d09288f3" Feb 02 13:04:22 crc kubenswrapper[4721]: E0202 13:04:22.378630 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a799bbd42f589f408396d566c14a41d1e41856170a302c97a49ff32d09288f3\": container with ID starting with 3a799bbd42f589f408396d566c14a41d1e41856170a302c97a49ff32d09288f3 not found: ID does not exist" containerID="3a799bbd42f589f408396d566c14a41d1e41856170a302c97a49ff32d09288f3" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.378760 4721 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3a799bbd42f589f408396d566c14a41d1e41856170a302c97a49ff32d09288f3"} err="failed to get container status \"3a799bbd42f589f408396d566c14a41d1e41856170a302c97a49ff32d09288f3\": rpc error: code = NotFound desc = could not find container \"3a799bbd42f589f408396d566c14a41d1e41856170a302c97a49ff32d09288f3\": container with ID starting with 3a799bbd42f589f408396d566c14a41d1e41856170a302c97a49ff32d09288f3 not found: ID does not exist" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.378881 4721 scope.go:117] "RemoveContainer" containerID="e6fbf9234354b68f2d074039ff57046d5bb0dc1205805a01d7f0f4b5a8698da0" Feb 02 13:04:22 crc kubenswrapper[4721]: E0202 13:04:22.379494 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6fbf9234354b68f2d074039ff57046d5bb0dc1205805a01d7f0f4b5a8698da0\": container with ID starting with e6fbf9234354b68f2d074039ff57046d5bb0dc1205805a01d7f0f4b5a8698da0 not found: ID does not exist" containerID="e6fbf9234354b68f2d074039ff57046d5bb0dc1205805a01d7f0f4b5a8698da0" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.379519 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6fbf9234354b68f2d074039ff57046d5bb0dc1205805a01d7f0f4b5a8698da0"} err="failed to get container status \"e6fbf9234354b68f2d074039ff57046d5bb0dc1205805a01d7f0f4b5a8698da0\": rpc error: code = NotFound desc = could not find container \"e6fbf9234354b68f2d074039ff57046d5bb0dc1205805a01d7f0f4b5a8698da0\": container with ID starting with e6fbf9234354b68f2d074039ff57046d5bb0dc1205805a01d7f0f4b5a8698da0 not found: ID does not exist" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.390867 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3275da10-006d-43c9-bdd6-46282b8ac9d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3275da10-006d-43c9-bdd6-46282b8ac9d1" (UID: "3275da10-006d-43c9-bdd6-46282b8ac9d1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.415528 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3275da10-006d-43c9-bdd6-46282b8ac9d1-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.415577 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8cfp\" (UniqueName: \"kubernetes.io/projected/3275da10-006d-43c9-bdd6-46282b8ac9d1-kube-api-access-q8cfp\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.415595 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3275da10-006d-43c9-bdd6-46282b8ac9d1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.611686 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-95btx"] Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.615687 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-95btx"] Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.829282 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pm5t7" podUID="b97707af-edd5-4907-9459-615b32a005e6" containerName="registry-server" probeResult="failure" output=< Feb 02 13:04:22 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:04:22 crc kubenswrapper[4721]: > Feb 02 13:04:22 crc kubenswrapper[4721]: I0202 13:04:22.994889 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jcv9"] Feb 02 13:04:23 crc kubenswrapper[4721]: I0202 13:04:23.305090 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7jcv9" podUID="d0f9579e-e58d-40b4-82c4-83111bfa9735" containerName="registry-server" containerID="cri-o://c8f57b53de24ab26aa99f7a4280707e1dd37483fb97ed167765b2081fa94f563" gracePeriod=2 Feb 02 13:04:23 crc kubenswrapper[4721]: I0202 13:04:23.610397 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 13:04:23 crc kubenswrapper[4721]: I0202 13:04:23.736522 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/429dff39-5ef0-4e43-99bc-30771e26645d-kube-api-access\") pod \"429dff39-5ef0-4e43-99bc-30771e26645d\" (UID: \"429dff39-5ef0-4e43-99bc-30771e26645d\") " Feb 02 13:04:23 crc kubenswrapper[4721]: I0202 13:04:23.736623 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/429dff39-5ef0-4e43-99bc-30771e26645d-kubelet-dir\") pod \"429dff39-5ef0-4e43-99bc-30771e26645d\" (UID: \"429dff39-5ef0-4e43-99bc-30771e26645d\") " Feb 02 13:04:23 crc kubenswrapper[4721]: I0202 13:04:23.736929 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/429dff39-5ef0-4e43-99bc-30771e26645d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "429dff39-5ef0-4e43-99bc-30771e26645d" (UID: "429dff39-5ef0-4e43-99bc-30771e26645d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:04:23 crc kubenswrapper[4721]: I0202 13:04:23.741562 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429dff39-5ef0-4e43-99bc-30771e26645d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "429dff39-5ef0-4e43-99bc-30771e26645d" (UID: "429dff39-5ef0-4e43-99bc-30771e26645d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:04:23 crc kubenswrapper[4721]: I0202 13:04:23.837834 4721 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/429dff39-5ef0-4e43-99bc-30771e26645d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:23 crc kubenswrapper[4721]: I0202 13:04:23.837868 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/429dff39-5ef0-4e43-99bc-30771e26645d-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.252805 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.313031 4721 generic.go:334] "Generic (PLEG): container finished" podID="d0f9579e-e58d-40b4-82c4-83111bfa9735" containerID="c8f57b53de24ab26aa99f7a4280707e1dd37483fb97ed167765b2081fa94f563" exitCode=0 Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.313110 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jcv9" event={"ID":"d0f9579e-e58d-40b4-82c4-83111bfa9735","Type":"ContainerDied","Data":"c8f57b53de24ab26aa99f7a4280707e1dd37483fb97ed167765b2081fa94f563"} Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.313133 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jcv9" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.313150 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jcv9" event={"ID":"d0f9579e-e58d-40b4-82c4-83111bfa9735","Type":"ContainerDied","Data":"00041994ab43d697d41af55566c6a0ac8e00b0660330d7b33111225ed94d785c"} Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.313169 4721 scope.go:117] "RemoveContainer" containerID="c8f57b53de24ab26aa99f7a4280707e1dd37483fb97ed167765b2081fa94f563" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.315671 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftf6s" event={"ID":"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b","Type":"ContainerStarted","Data":"9828d087ebd0a3d61cf032280a26f38ac894879bfffd8ab8dc7b9c9e262b96fd"} Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.320430 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"429dff39-5ef0-4e43-99bc-30771e26645d","Type":"ContainerDied","Data":"b69839db7fc14233bb04f6125cec3d414151d8a54e4db751a1dca8cdc74f05a2"} Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.320492 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b69839db7fc14233bb04f6125cec3d414151d8a54e4db751a1dca8cdc74f05a2" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.320612 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.338285 4721 scope.go:117] "RemoveContainer" containerID="90052ca542d1d90d3ece1db6282e32c41dc45e759cbb4a25db16d8178731746b" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.352556 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24wc6\" (UniqueName: \"kubernetes.io/projected/d0f9579e-e58d-40b4-82c4-83111bfa9735-kube-api-access-24wc6\") pod \"d0f9579e-e58d-40b4-82c4-83111bfa9735\" (UID: \"d0f9579e-e58d-40b4-82c4-83111bfa9735\") " Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.352674 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f9579e-e58d-40b4-82c4-83111bfa9735-utilities\") pod \"d0f9579e-e58d-40b4-82c4-83111bfa9735\" (UID: \"d0f9579e-e58d-40b4-82c4-83111bfa9735\") " Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.352793 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f9579e-e58d-40b4-82c4-83111bfa9735-catalog-content\") pod \"d0f9579e-e58d-40b4-82c4-83111bfa9735\" (UID: \"d0f9579e-e58d-40b4-82c4-83111bfa9735\") " Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.353603 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0f9579e-e58d-40b4-82c4-83111bfa9735-utilities" (OuterVolumeSpecName: "utilities") pod "d0f9579e-e58d-40b4-82c4-83111bfa9735" (UID: "d0f9579e-e58d-40b4-82c4-83111bfa9735"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.358258 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0f9579e-e58d-40b4-82c4-83111bfa9735-kube-api-access-24wc6" (OuterVolumeSpecName: "kube-api-access-24wc6") pod "d0f9579e-e58d-40b4-82c4-83111bfa9735" (UID: "d0f9579e-e58d-40b4-82c4-83111bfa9735"). InnerVolumeSpecName "kube-api-access-24wc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.363926 4721 scope.go:117] "RemoveContainer" containerID="3ccd0f09abd6b2021524950faafaef3b13ab04a32e2570fdc710dd7ecb17b03e" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.377166 4721 scope.go:117] "RemoveContainer" containerID="c8f57b53de24ab26aa99f7a4280707e1dd37483fb97ed167765b2081fa94f563" Feb 02 13:04:24 crc kubenswrapper[4721]: E0202 13:04:24.377555 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8f57b53de24ab26aa99f7a4280707e1dd37483fb97ed167765b2081fa94f563\": container with ID starting with c8f57b53de24ab26aa99f7a4280707e1dd37483fb97ed167765b2081fa94f563 not found: ID does not exist" containerID="c8f57b53de24ab26aa99f7a4280707e1dd37483fb97ed167765b2081fa94f563" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.377586 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8f57b53de24ab26aa99f7a4280707e1dd37483fb97ed167765b2081fa94f563"} err="failed to get container status \"c8f57b53de24ab26aa99f7a4280707e1dd37483fb97ed167765b2081fa94f563\": rpc error: code = NotFound desc = could not find container \"c8f57b53de24ab26aa99f7a4280707e1dd37483fb97ed167765b2081fa94f563\": container with ID starting with c8f57b53de24ab26aa99f7a4280707e1dd37483fb97ed167765b2081fa94f563 not found: ID does not exist" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.377608 4721 scope.go:117] "RemoveContainer" containerID="90052ca542d1d90d3ece1db6282e32c41dc45e759cbb4a25db16d8178731746b" Feb 02 13:04:24 crc kubenswrapper[4721]: E0202 13:04:24.377860 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90052ca542d1d90d3ece1db6282e32c41dc45e759cbb4a25db16d8178731746b\": container with ID starting with 90052ca542d1d90d3ece1db6282e32c41dc45e759cbb4a25db16d8178731746b not found: ID does not exist" containerID="90052ca542d1d90d3ece1db6282e32c41dc45e759cbb4a25db16d8178731746b" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.377887 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90052ca542d1d90d3ece1db6282e32c41dc45e759cbb4a25db16d8178731746b"} err="failed to get container status \"90052ca542d1d90d3ece1db6282e32c41dc45e759cbb4a25db16d8178731746b\": rpc error: code = NotFound desc = could not find container \"90052ca542d1d90d3ece1db6282e32c41dc45e759cbb4a25db16d8178731746b\": container with ID starting with 90052ca542d1d90d3ece1db6282e32c41dc45e759cbb4a25db16d8178731746b not found: ID does not exist" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.377908 4721 scope.go:117] "RemoveContainer" containerID="3ccd0f09abd6b2021524950faafaef3b13ab04a32e2570fdc710dd7ecb17b03e" Feb 02 13:04:24 crc kubenswrapper[4721]: E0202 13:04:24.378214 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ccd0f09abd6b2021524950faafaef3b13ab04a32e2570fdc710dd7ecb17b03e\": container with ID starting with 3ccd0f09abd6b2021524950faafaef3b13ab04a32e2570fdc710dd7ecb17b03e not found: ID does not exist" containerID="3ccd0f09abd6b2021524950faafaef3b13ab04a32e2570fdc710dd7ecb17b03e" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.378281 4721 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3ccd0f09abd6b2021524950faafaef3b13ab04a32e2570fdc710dd7ecb17b03e"} err="failed to get container status \"3ccd0f09abd6b2021524950faafaef3b13ab04a32e2570fdc710dd7ecb17b03e\": rpc error: code = NotFound desc = could not find container \"3ccd0f09abd6b2021524950faafaef3b13ab04a32e2570fdc710dd7ecb17b03e\": container with ID starting with 3ccd0f09abd6b2021524950faafaef3b13ab04a32e2570fdc710dd7ecb17b03e not found: ID does not exist" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.383759 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0f9579e-e58d-40b4-82c4-83111bfa9735-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0f9579e-e58d-40b4-82c4-83111bfa9735" (UID: "d0f9579e-e58d-40b4-82c4-83111bfa9735"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.418154 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3275da10-006d-43c9-bdd6-46282b8ac9d1" path="/var/lib/kubelet/pods/3275da10-006d-43c9-bdd6-46282b8ac9d1/volumes" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.454223 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0f9579e-e58d-40b4-82c4-83111bfa9735-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.454774 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24wc6\" (UniqueName: \"kubernetes.io/projected/d0f9579e-e58d-40b4-82c4-83111bfa9735-kube-api-access-24wc6\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.454788 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0f9579e-e58d-40b4-82c4-83111bfa9735-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.627715 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jcv9"] Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.636845 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jcv9"] Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.793606 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xqxjr"] Feb 02 13:04:24 crc kubenswrapper[4721]: I0202 13:04:24.793961 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xqxjr" podUID="fca822b5-78c8-47d9-9cc5-266118a2b5aa" containerName="registry-server" containerID="cri-o://d88ecd85c5a2024374c4bb375130ae1821f595497ae59083522e66adc526cf21" gracePeriod=2 Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.009180 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 02 13:04:25 crc kubenswrapper[4721]: E0202 13:04:25.009401 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3275da10-006d-43c9-bdd6-46282b8ac9d1" containerName="extract-content" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.009419 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3275da10-006d-43c9-bdd6-46282b8ac9d1" containerName="extract-content" Feb 02 13:04:25 crc kubenswrapper[4721]: E0202 13:04:25.009432 4721 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d0f9579e-e58d-40b4-82c4-83111bfa9735" containerName="extract-utilities" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.009439 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f9579e-e58d-40b4-82c4-83111bfa9735" containerName="extract-utilities" Feb 02 13:04:25 crc kubenswrapper[4721]: E0202 13:04:25.009447 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f9579e-e58d-40b4-82c4-83111bfa9735" containerName="extract-content" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.009453 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f9579e-e58d-40b4-82c4-83111bfa9735" containerName="extract-content" Feb 02 13:04:25 crc kubenswrapper[4721]: E0202 13:04:25.009466 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0f9579e-e58d-40b4-82c4-83111bfa9735" containerName="registry-server" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.009472 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0f9579e-e58d-40b4-82c4-83111bfa9735" containerName="registry-server" Feb 02 13:04:25 crc kubenswrapper[4721]: E0202 13:04:25.009485 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3275da10-006d-43c9-bdd6-46282b8ac9d1" containerName="registry-server" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.009490 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3275da10-006d-43c9-bdd6-46282b8ac9d1" containerName="registry-server" Feb 02 13:04:25 crc kubenswrapper[4721]: E0202 13:04:25.009499 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429dff39-5ef0-4e43-99bc-30771e26645d" containerName="pruner" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.009505 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="429dff39-5ef0-4e43-99bc-30771e26645d" containerName="pruner" Feb 02 13:04:25 crc kubenswrapper[4721]: E0202 13:04:25.009514 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3275da10-006d-43c9-bdd6-46282b8ac9d1" containerName="extract-utilities" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.009521 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3275da10-006d-43c9-bdd6-46282b8ac9d1" containerName="extract-utilities" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.009606 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0f9579e-e58d-40b4-82c4-83111bfa9735" containerName="registry-server" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.009616 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="3275da10-006d-43c9-bdd6-46282b8ac9d1" containerName="registry-server" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.009623 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="429dff39-5ef0-4e43-99bc-30771e26645d" containerName="pruner" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.009970 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.011949 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.015493 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.015937 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.161862 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9755d24d-ee48-44a4-aa63-5b014999e3a9-kube-api-access\") pod \"installer-9-crc\" (UID: \"9755d24d-ee48-44a4-aa63-5b014999e3a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.162229 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9755d24d-ee48-44a4-aa63-5b014999e3a9-var-lock\") pod \"installer-9-crc\" (UID: \"9755d24d-ee48-44a4-aa63-5b014999e3a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.162318 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9755d24d-ee48-44a4-aa63-5b014999e3a9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"9755d24d-ee48-44a4-aa63-5b014999e3a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.198759 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.263115 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8zdz\" (UniqueName: \"kubernetes.io/projected/fca822b5-78c8-47d9-9cc5-266118a2b5aa-kube-api-access-t8zdz\") pod \"fca822b5-78c8-47d9-9cc5-266118a2b5aa\" (UID: \"fca822b5-78c8-47d9-9cc5-266118a2b5aa\") " Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.263303 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fca822b5-78c8-47d9-9cc5-266118a2b5aa-catalog-content\") pod \"fca822b5-78c8-47d9-9cc5-266118a2b5aa\" (UID: \"fca822b5-78c8-47d9-9cc5-266118a2b5aa\") " Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.263339 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fca822b5-78c8-47d9-9cc5-266118a2b5aa-utilities\") pod \"fca822b5-78c8-47d9-9cc5-266118a2b5aa\" (UID: \"fca822b5-78c8-47d9-9cc5-266118a2b5aa\") " Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.263482 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9755d24d-ee48-44a4-aa63-5b014999e3a9-kube-api-access\") pod \"installer-9-crc\" (UID: \"9755d24d-ee48-44a4-aa63-5b014999e3a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.263519 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9755d24d-ee48-44a4-aa63-5b014999e3a9-var-lock\") pod \"installer-9-crc\" (UID: \"9755d24d-ee48-44a4-aa63-5b014999e3a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.263561 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9755d24d-ee48-44a4-aa63-5b014999e3a9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"9755d24d-ee48-44a4-aa63-5b014999e3a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.263651 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9755d24d-ee48-44a4-aa63-5b014999e3a9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"9755d24d-ee48-44a4-aa63-5b014999e3a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.265253 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fca822b5-78c8-47d9-9cc5-266118a2b5aa-utilities" (OuterVolumeSpecName: "utilities") pod "fca822b5-78c8-47d9-9cc5-266118a2b5aa" (UID: "fca822b5-78c8-47d9-9cc5-266118a2b5aa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.268346 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9755d24d-ee48-44a4-aa63-5b014999e3a9-var-lock\") pod \"installer-9-crc\" (UID: \"9755d24d-ee48-44a4-aa63-5b014999e3a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.271358 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fca822b5-78c8-47d9-9cc5-266118a2b5aa-kube-api-access-t8zdz" (OuterVolumeSpecName: "kube-api-access-t8zdz") pod "fca822b5-78c8-47d9-9cc5-266118a2b5aa" (UID: "fca822b5-78c8-47d9-9cc5-266118a2b5aa"). InnerVolumeSpecName "kube-api-access-t8zdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.287377 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9755d24d-ee48-44a4-aa63-5b014999e3a9-kube-api-access\") pod \"installer-9-crc\" (UID: \"9755d24d-ee48-44a4-aa63-5b014999e3a9\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.326991 4721 generic.go:334] "Generic (PLEG): container finished" podID="7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" containerID="9828d087ebd0a3d61cf032280a26f38ac894879bfffd8ab8dc7b9c9e262b96fd" exitCode=0 Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.327054 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftf6s" event={"ID":"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b","Type":"ContainerDied","Data":"9828d087ebd0a3d61cf032280a26f38ac894879bfffd8ab8dc7b9c9e262b96fd"} Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.331179 4721 generic.go:334] "Generic (PLEG): container finished" podID="fca822b5-78c8-47d9-9cc5-266118a2b5aa" containerID="d88ecd85c5a2024374c4bb375130ae1821f595497ae59083522e66adc526cf21" exitCode=0 Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.331226 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqxjr" event={"ID":"fca822b5-78c8-47d9-9cc5-266118a2b5aa","Type":"ContainerDied","Data":"d88ecd85c5a2024374c4bb375130ae1821f595497ae59083522e66adc526cf21"} Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.331270 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqxjr" event={"ID":"fca822b5-78c8-47d9-9cc5-266118a2b5aa","Type":"ContainerDied","Data":"47060e9c65a5aeb6ad4fedeb9a16d1cc215190986f8d21247996042fbd85459e"} Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.331258 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xqxjr" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.331285 4721 scope.go:117] "RemoveContainer" containerID="d88ecd85c5a2024374c4bb375130ae1821f595497ae59083522e66adc526cf21" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.337796 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.357661 4721 scope.go:117] "RemoveContainer" containerID="6b1f19b1332f89efa38077451d6165f2371bd16012b96e8fb0038bfe0c3a6dbe" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.367276 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fca822b5-78c8-47d9-9cc5-266118a2b5aa-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.367313 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8zdz\" (UniqueName: \"kubernetes.io/projected/fca822b5-78c8-47d9-9cc5-266118a2b5aa-kube-api-access-t8zdz\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.378539 4721 scope.go:117] "RemoveContainer" containerID="1307da490b649bddfe1be4c1fdffa3d95bafee24aa3239fe6da07477129dd995" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.402713 4721 scope.go:117] "RemoveContainer" containerID="d88ecd85c5a2024374c4bb375130ae1821f595497ae59083522e66adc526cf21" Feb 02 13:04:25 crc kubenswrapper[4721]: E0202 13:04:25.404269 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88ecd85c5a2024374c4bb375130ae1821f595497ae59083522e66adc526cf21\": container with ID starting with d88ecd85c5a2024374c4bb375130ae1821f595497ae59083522e66adc526cf21 not found: ID does not exist" containerID="d88ecd85c5a2024374c4bb375130ae1821f595497ae59083522e66adc526cf21" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.404314 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88ecd85c5a2024374c4bb375130ae1821f595497ae59083522e66adc526cf21"} err="failed to get container status \"d88ecd85c5a2024374c4bb375130ae1821f595497ae59083522e66adc526cf21\": rpc error: code = NotFound desc = could not find container \"d88ecd85c5a2024374c4bb375130ae1821f595497ae59083522e66adc526cf21\": container with ID starting with d88ecd85c5a2024374c4bb375130ae1821f595497ae59083522e66adc526cf21 not found: ID does not exist" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.404345 4721 scope.go:117] "RemoveContainer" containerID="6b1f19b1332f89efa38077451d6165f2371bd16012b96e8fb0038bfe0c3a6dbe" Feb 02 13:04:25 crc kubenswrapper[4721]: E0202 13:04:25.404653 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1f19b1332f89efa38077451d6165f2371bd16012b96e8fb0038bfe0c3a6dbe\": container with ID starting with 6b1f19b1332f89efa38077451d6165f2371bd16012b96e8fb0038bfe0c3a6dbe not found: ID does not exist" containerID="6b1f19b1332f89efa38077451d6165f2371bd16012b96e8fb0038bfe0c3a6dbe" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.404698 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1f19b1332f89efa38077451d6165f2371bd16012b96e8fb0038bfe0c3a6dbe"} err="failed to get container status \"6b1f19b1332f89efa38077451d6165f2371bd16012b96e8fb0038bfe0c3a6dbe\": rpc error: code = NotFound desc = could not find container \"6b1f19b1332f89efa38077451d6165f2371bd16012b96e8fb0038bfe0c3a6dbe\": container with ID starting with 6b1f19b1332f89efa38077451d6165f2371bd16012b96e8fb0038bfe0c3a6dbe not found: ID does not exist" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.404724 4721 scope.go:117] "RemoveContainer" 
containerID="1307da490b649bddfe1be4c1fdffa3d95bafee24aa3239fe6da07477129dd995" Feb 02 13:04:25 crc kubenswrapper[4721]: E0202 13:04:25.404962 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1307da490b649bddfe1be4c1fdffa3d95bafee24aa3239fe6da07477129dd995\": container with ID starting with 1307da490b649bddfe1be4c1fdffa3d95bafee24aa3239fe6da07477129dd995 not found: ID does not exist" containerID="1307da490b649bddfe1be4c1fdffa3d95bafee24aa3239fe6da07477129dd995" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.404995 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1307da490b649bddfe1be4c1fdffa3d95bafee24aa3239fe6da07477129dd995"} err="failed to get container status \"1307da490b649bddfe1be4c1fdffa3d95bafee24aa3239fe6da07477129dd995\": rpc error: code = NotFound desc = could not find container \"1307da490b649bddfe1be4c1fdffa3d95bafee24aa3239fe6da07477129dd995\": container with ID starting with 1307da490b649bddfe1be4c1fdffa3d95bafee24aa3239fe6da07477129dd995 not found: ID does not exist" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.582867 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 02 13:04:25 crc kubenswrapper[4721]: W0202 13:04:25.592516 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9755d24d_ee48_44a4_aa63_5b014999e3a9.slice/crio-77bc31793d5cf8c9c2a6f1e7dc74fa1124392a5cef395c56ee4646291d25f2eb WatchSource:0}: Error finding container 77bc31793d5cf8c9c2a6f1e7dc74fa1124392a5cef395c56ee4646291d25f2eb: Status 404 returned error can't find the container with id 77bc31793d5cf8c9c2a6f1e7dc74fa1124392a5cef395c56ee4646291d25f2eb Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.716330 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fca822b5-78c8-47d9-9cc5-266118a2b5aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fca822b5-78c8-47d9-9cc5-266118a2b5aa" (UID: "fca822b5-78c8-47d9-9cc5-266118a2b5aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.773764 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fca822b5-78c8-47d9-9cc5-266118a2b5aa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.958946 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xqxjr"] Feb 02 13:04:25 crc kubenswrapper[4721]: I0202 13:04:25.963011 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xqxjr"] Feb 02 13:04:26 crc kubenswrapper[4721]: I0202 13:04:26.338524 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9755d24d-ee48-44a4-aa63-5b014999e3a9","Type":"ContainerStarted","Data":"ee2400de463f4d9467d730014ae34268bf8cc03fd437b95ba162e6c9f10ba80b"} Feb 02 13:04:26 crc kubenswrapper[4721]: I0202 13:04:26.338788 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9755d24d-ee48-44a4-aa63-5b014999e3a9","Type":"ContainerStarted","Data":"77bc31793d5cf8c9c2a6f1e7dc74fa1124392a5cef395c56ee4646291d25f2eb"} Feb 02 13:04:26 crc kubenswrapper[4721]: I0202 13:04:26.424955 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0f9579e-e58d-40b4-82c4-83111bfa9735" path="/var/lib/kubelet/pods/d0f9579e-e58d-40b4-82c4-83111bfa9735/volumes" Feb 02 13:04:26 crc kubenswrapper[4721]: I0202 13:04:26.426030 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fca822b5-78c8-47d9-9cc5-266118a2b5aa" path="/var/lib/kubelet/pods/fca822b5-78c8-47d9-9cc5-266118a2b5aa/volumes" Feb 02 13:04:27 crc kubenswrapper[4721]: I0202 13:04:27.348042 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftf6s" event={"ID":"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b","Type":"ContainerStarted","Data":"410edca835d21d18be323b803ab840df97552992c0ec9f405104a39dad3828e0"} Feb 02 13:04:27 crc kubenswrapper[4721]: I0202 13:04:27.365732 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.365711874 podStartE2EDuration="3.365711874s" podCreationTimestamp="2026-02-02 13:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:04:27.364280903 +0000 UTC m=+207.666795292" watchObservedRunningTime="2026-02-02 13:04:27.365711874 +0000 UTC m=+207.668226263" Feb 02 13:04:27 crc kubenswrapper[4721]: I0202 13:04:27.394433 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ftf6s" podStartSLOduration=2.335025194 podStartE2EDuration="49.394406042s" podCreationTimestamp="2026-02-02 13:03:38 +0000 UTC" firstStartedPulling="2026-02-02 13:03:40.037664411 +0000 UTC m=+160.340178800" lastFinishedPulling="2026-02-02 13:04:27.097045239 +0000 UTC m=+207.399559648" observedRunningTime="2026-02-02 13:04:27.390222904 +0000 UTC m=+207.692737293" watchObservedRunningTime="2026-02-02 13:04:27.394406042 +0000 UTC m=+207.696920431" Feb 02 13:04:28 crc kubenswrapper[4721]: I0202 13:04:28.356628 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpqtk" 
event={"ID":"5b5702e4-96dd-479b-871a-d69bfdba91e1","Type":"ContainerStarted","Data":"cac3571fa985c8eb2f6d12ace5635fa81b355f00311311849ccc78de7cbf1b4c"} Feb 02 13:04:28 crc kubenswrapper[4721]: I0202 13:04:28.474540 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:04:28 crc kubenswrapper[4721]: I0202 13:04:28.474633 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:04:29 crc kubenswrapper[4721]: I0202 13:04:29.366374 4721 generic.go:334] "Generic (PLEG): container finished" podID="5b5702e4-96dd-479b-871a-d69bfdba91e1" containerID="cac3571fa985c8eb2f6d12ace5635fa81b355f00311311849ccc78de7cbf1b4c" exitCode=0 Feb 02 13:04:29 crc kubenswrapper[4721]: I0202 13:04:29.366480 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpqtk" event={"ID":"5b5702e4-96dd-479b-871a-d69bfdba91e1","Type":"ContainerDied","Data":"cac3571fa985c8eb2f6d12ace5635fa81b355f00311311849ccc78de7cbf1b4c"} Feb 02 13:04:29 crc kubenswrapper[4721]: I0202 13:04:29.512579 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-ftf6s" podUID="7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" containerName="registry-server" probeResult="failure" output=< Feb 02 13:04:29 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:04:29 crc kubenswrapper[4721]: > Feb 02 13:04:31 crc kubenswrapper[4721]: I0202 13:04:31.835607 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:04:31 crc kubenswrapper[4721]: I0202 13:04:31.880925 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:04:32 crc kubenswrapper[4721]: I0202 13:04:32.383420 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpqtk" event={"ID":"5b5702e4-96dd-479b-871a-d69bfdba91e1","Type":"ContainerStarted","Data":"658375981ca92686d238c2129054336aaf19b4f61909f67142ef9963bb6dbfe9"} Feb 02 13:04:32 crc kubenswrapper[4721]: I0202 13:04:32.386396 4721 generic.go:334] "Generic (PLEG): container finished" podID="c32d34a1-8dd8-435d-9491-748392c25b97" containerID="cb844a088120bc9509ad8ea45c170bf127b70cd4144302a7ad6ac587245cde76" exitCode=0 Feb 02 13:04:32 crc kubenswrapper[4721]: I0202 13:04:32.386465 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2tcj" event={"ID":"c32d34a1-8dd8-435d-9491-748392c25b97","Type":"ContainerDied","Data":"cb844a088120bc9509ad8ea45c170bf127b70cd4144302a7ad6ac587245cde76"} Feb 02 13:04:32 crc kubenswrapper[4721]: I0202 13:04:32.419681 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hpqtk" podStartSLOduration=3.184057842 podStartE2EDuration="54.419663968s" podCreationTimestamp="2026-02-02 13:03:38 +0000 UTC" firstStartedPulling="2026-02-02 13:03:40.055242239 +0000 UTC m=+160.357756628" lastFinishedPulling="2026-02-02 13:04:31.290848365 +0000 UTC m=+211.593362754" observedRunningTime="2026-02-02 13:04:32.399107919 +0000 UTC m=+212.701622318" watchObservedRunningTime="2026-02-02 13:04:32.419663968 +0000 UTC m=+212.722178357" Feb 02 13:04:33 crc kubenswrapper[4721]: I0202 13:04:33.394078 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-s2tcj" event={"ID":"c32d34a1-8dd8-435d-9491-748392c25b97","Type":"ContainerStarted","Data":"53d55fa7bab62269a87c795054de164d19a14b295c5f255ec1a3087c58689970"} Feb 02 13:04:33 crc kubenswrapper[4721]: I0202 13:04:33.411685 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s2tcj" podStartSLOduration=2.669611636 podStartE2EDuration="55.41166968s" podCreationTimestamp="2026-02-02 13:03:38 +0000 UTC" firstStartedPulling="2026-02-02 13:03:40.045848963 +0000 UTC m=+160.348363352" lastFinishedPulling="2026-02-02 13:04:32.787907007 +0000 UTC m=+213.090421396" observedRunningTime="2026-02-02 13:04:33.411051403 +0000 UTC m=+213.713565792" watchObservedRunningTime="2026-02-02 13:04:33.41166968 +0000 UTC m=+213.714184059" Feb 02 13:04:38 crc kubenswrapper[4721]: I0202 13:04:38.511754 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:04:38 crc kubenswrapper[4721]: I0202 13:04:38.555249 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:04:38 crc kubenswrapper[4721]: I0202 13:04:38.674182 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:04:38 crc kubenswrapper[4721]: I0202 13:04:38.674290 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:04:38 crc kubenswrapper[4721]: I0202 13:04:38.725488 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:04:39 crc kubenswrapper[4721]: I0202 13:04:39.112861 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:04:39 crc kubenswrapper[4721]: I0202 13:04:39.112906 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:04:39 crc kubenswrapper[4721]: I0202 13:04:39.149110 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:04:39 crc kubenswrapper[4721]: I0202 13:04:39.463279 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:04:39 crc kubenswrapper[4721]: I0202 13:04:39.473089 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:04:40 crc kubenswrapper[4721]: I0202 13:04:40.318966 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fqbhq"] Feb 02 13:04:40 crc kubenswrapper[4721]: I0202 13:04:40.743335 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpqtk"] Feb 02 13:04:41 crc kubenswrapper[4721]: I0202 13:04:41.434535 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hpqtk" podUID="5b5702e4-96dd-479b-871a-d69bfdba91e1" containerName="registry-server" containerID="cri-o://658375981ca92686d238c2129054336aaf19b4f61909f67142ef9963bb6dbfe9" gracePeriod=2 Feb 02 13:04:42 crc kubenswrapper[4721]: I0202 13:04:42.443208 4721 generic.go:334] 
"Generic (PLEG): container finished" podID="5b5702e4-96dd-479b-871a-d69bfdba91e1" containerID="658375981ca92686d238c2129054336aaf19b4f61909f67142ef9963bb6dbfe9" exitCode=0 Feb 02 13:04:42 crc kubenswrapper[4721]: I0202 13:04:42.443311 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpqtk" event={"ID":"5b5702e4-96dd-479b-871a-d69bfdba91e1","Type":"ContainerDied","Data":"658375981ca92686d238c2129054336aaf19b4f61909f67142ef9963bb6dbfe9"} Feb 02 13:04:43 crc kubenswrapper[4721]: I0202 13:04:43.513973 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:04:43 crc kubenswrapper[4721]: I0202 13:04:43.603006 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b5702e4-96dd-479b-871a-d69bfdba91e1-utilities\") pod \"5b5702e4-96dd-479b-871a-d69bfdba91e1\" (UID: \"5b5702e4-96dd-479b-871a-d69bfdba91e1\") " Feb 02 13:04:43 crc kubenswrapper[4721]: I0202 13:04:43.603430 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b5702e4-96dd-479b-871a-d69bfdba91e1-catalog-content\") pod \"5b5702e4-96dd-479b-871a-d69bfdba91e1\" (UID: \"5b5702e4-96dd-479b-871a-d69bfdba91e1\") " Feb 02 13:04:43 crc kubenswrapper[4721]: I0202 13:04:43.603682 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdlcg\" (UniqueName: \"kubernetes.io/projected/5b5702e4-96dd-479b-871a-d69bfdba91e1-kube-api-access-jdlcg\") pod \"5b5702e4-96dd-479b-871a-d69bfdba91e1\" (UID: \"5b5702e4-96dd-479b-871a-d69bfdba91e1\") " Feb 02 13:04:43 crc kubenswrapper[4721]: I0202 13:04:43.603823 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b5702e4-96dd-479b-871a-d69bfdba91e1-utilities" (OuterVolumeSpecName: "utilities") pod "5b5702e4-96dd-479b-871a-d69bfdba91e1" (UID: "5b5702e4-96dd-479b-871a-d69bfdba91e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:04:43 crc kubenswrapper[4721]: I0202 13:04:43.604252 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b5702e4-96dd-479b-871a-d69bfdba91e1-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:43 crc kubenswrapper[4721]: I0202 13:04:43.610054 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b5702e4-96dd-479b-871a-d69bfdba91e1-kube-api-access-jdlcg" (OuterVolumeSpecName: "kube-api-access-jdlcg") pod "5b5702e4-96dd-479b-871a-d69bfdba91e1" (UID: "5b5702e4-96dd-479b-871a-d69bfdba91e1"). InnerVolumeSpecName "kube-api-access-jdlcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:04:43 crc kubenswrapper[4721]: I0202 13:04:43.651356 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b5702e4-96dd-479b-871a-d69bfdba91e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b5702e4-96dd-479b-871a-d69bfdba91e1" (UID: "5b5702e4-96dd-479b-871a-d69bfdba91e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:04:43 crc kubenswrapper[4721]: I0202 13:04:43.704997 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b5702e4-96dd-479b-871a-d69bfdba91e1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:43 crc kubenswrapper[4721]: I0202 13:04:43.705032 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdlcg\" (UniqueName: \"kubernetes.io/projected/5b5702e4-96dd-479b-871a-d69bfdba91e1-kube-api-access-jdlcg\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:44 crc kubenswrapper[4721]: I0202 13:04:44.454187 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hpqtk" event={"ID":"5b5702e4-96dd-479b-871a-d69bfdba91e1","Type":"ContainerDied","Data":"cee4e973cd74c01297b1880257bab371fd0bb8e104dad9318438c355e90ddc2e"} Feb 02 13:04:44 crc kubenswrapper[4721]: I0202 13:04:44.454284 4721 scope.go:117] "RemoveContainer" containerID="658375981ca92686d238c2129054336aaf19b4f61909f67142ef9963bb6dbfe9" Feb 02 13:04:44 crc kubenswrapper[4721]: I0202 13:04:44.454301 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hpqtk" Feb 02 13:04:44 crc kubenswrapper[4721]: I0202 13:04:44.467789 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hpqtk"] Feb 02 13:04:44 crc kubenswrapper[4721]: I0202 13:04:44.474577 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hpqtk"] Feb 02 13:04:44 crc kubenswrapper[4721]: I0202 13:04:44.478481 4721 scope.go:117] "RemoveContainer" containerID="cac3571fa985c8eb2f6d12ace5635fa81b355f00311311849ccc78de7cbf1b4c" Feb 02 13:04:44 crc kubenswrapper[4721]: I0202 13:04:44.500538 4721 scope.go:117] "RemoveContainer" containerID="7fd0cce05f07f725f6f879eba319ea2fa9bcc093917d1f8c17be638fe958ae8a" Feb 02 13:04:44 crc kubenswrapper[4721]: I0202 13:04:44.763826 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:04:44 crc kubenswrapper[4721]: I0202 13:04:44.763890 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:04:44 crc kubenswrapper[4721]: I0202 13:04:44.763933 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:04:44 crc kubenswrapper[4721]: I0202 13:04:44.764490 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 13:04:44 crc kubenswrapper[4721]: I0202 13:04:44.764548 4721 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591" gracePeriod=600 Feb 02 13:04:45 crc kubenswrapper[4721]: I0202 13:04:45.464813 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591" exitCode=0 Feb 02 13:04:45 crc kubenswrapper[4721]: I0202 13:04:45.465046 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591"} Feb 02 13:04:46 crc kubenswrapper[4721]: I0202 13:04:46.415803 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b5702e4-96dd-479b-871a-d69bfdba91e1" path="/var/lib/kubelet/pods/5b5702e4-96dd-479b-871a-d69bfdba91e1/volumes" Feb 02 13:04:46 crc kubenswrapper[4721]: I0202 13:04:46.473292 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"b87a5ff0b46a353772eefb47418659c5151aa34831e5ecd0ccf0f601bac1ccfe"} Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.799855 4721 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.800900 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5" gracePeriod=15 Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.801048 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e" gracePeriod=15 Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.801107 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f" gracePeriod=15 Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.801139 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06" gracePeriod=15 Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.801167 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787" gracePeriod=15 Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.801890 
4721 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 13:05:03 crc kubenswrapper[4721]: E0202 13:05:03.802272 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802286 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 02 13:05:03 crc kubenswrapper[4721]: E0202 13:05:03.802297 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fca822b5-78c8-47d9-9cc5-266118a2b5aa" containerName="extract-content" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802304 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca822b5-78c8-47d9-9cc5-266118a2b5aa" containerName="extract-content" Feb 02 13:05:03 crc kubenswrapper[4721]: E0202 13:05:03.802319 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802349 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 02 13:05:03 crc kubenswrapper[4721]: E0202 13:05:03.802364 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5702e4-96dd-479b-871a-d69bfdba91e1" containerName="registry-server" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802371 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5702e4-96dd-479b-871a-d69bfdba91e1" containerName="registry-server" Feb 02 13:05:03 crc kubenswrapper[4721]: E0202 13:05:03.802383 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802390 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 02 13:05:03 crc kubenswrapper[4721]: E0202 13:05:03.802399 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802405 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 02 13:05:03 crc kubenswrapper[4721]: E0202 13:05:03.802432 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fca822b5-78c8-47d9-9cc5-266118a2b5aa" containerName="extract-utilities" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802438 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca822b5-78c8-47d9-9cc5-266118a2b5aa" containerName="extract-utilities" Feb 02 13:05:03 crc kubenswrapper[4721]: E0202 13:05:03.802447 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5702e4-96dd-479b-871a-d69bfdba91e1" containerName="extract-content" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802455 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5702e4-96dd-479b-871a-d69bfdba91e1" containerName="extract-content" Feb 02 13:05:03 crc kubenswrapper[4721]: E0202 13:05:03.802463 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 13:05:03 crc 
kubenswrapper[4721]: I0202 13:05:03.802469 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 13:05:03 crc kubenswrapper[4721]: E0202 13:05:03.802476 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802503 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 02 13:05:03 crc kubenswrapper[4721]: E0202 13:05:03.802517 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5702e4-96dd-479b-871a-d69bfdba91e1" containerName="extract-utilities" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802524 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5702e4-96dd-479b-871a-d69bfdba91e1" containerName="extract-utilities" Feb 02 13:05:03 crc kubenswrapper[4721]: E0202 13:05:03.802530 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fca822b5-78c8-47d9-9cc5-266118a2b5aa" containerName="registry-server" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802538 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca822b5-78c8-47d9-9cc5-266118a2b5aa" containerName="registry-server" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802724 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802764 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="fca822b5-78c8-47d9-9cc5-266118a2b5aa" containerName="registry-server" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802777 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802785 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802831 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802848 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.802857 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b5702e4-96dd-479b-871a-d69bfdba91e1" containerName="registry-server" Feb 02 13:05:03 crc kubenswrapper[4721]: E0202 13:05:03.802991 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.803001 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.803320 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 
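The REMOVE/ADD pair with source="file" above is the static-pod path: the kubelet watches its manifest directory, and a rewritten manifest is handled as a delete of the old pod plus a create of a new one rather than an in-place update. The new UID (71bb4a3a..., versus old f4b27818... in the "Pod was deleted and then recreated" entry below) is derived by hashing the manifest, which is also why the cpu_manager/memory_manager RemoveStaleState lines clean up per-container resource state keyed by the old UID. A loose illustration of the idea (hypothetical file path; the real kubelet hashes the decoded pod object, not the raw file bytes):

    package main

    import (
    	"crypto/md5"
    	"fmt"
    	"os"
    )

    func main() {
    	// Any change to the manifest changes the hash, hence a brand-new pod UID.
    	data, err := os.ReadFile("/etc/kubernetes/manifests/kube-apiserver-pod.yaml")
    	if err != nil {
    		fmt.Fprintln(os.Stderr, err)
    		os.Exit(1)
    	}
    	fmt.Printf("manifest hash (stands in for the static pod UID): %x\n", md5.Sum(data))
    }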
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.804357 4721 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.805097 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.815976 4721 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.842738 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.859726 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.859783 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.859839 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.859855 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.859870 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.859887 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.859913 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.859939 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961259 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961319 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961345 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961365 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961394 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961427 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961463 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961486 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961456 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961521 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961446 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961575 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961512 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961614 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961717 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 13:05:03 crc kubenswrapper[4721]: I0202 13:05:03.961806 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 13:05:04 crc kubenswrapper[4721]: W0202 13:05:04.156312 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-bb9d2c698316f6af3480b619b2c5603728224c60aad632bbec2cec1c9b7ab9ae WatchSource:0}: Error finding container bb9d2c698316f6af3480b619b2c5603728224c60aad632bbec2cec1c9b7ab9ae: Status 404 returned error can't find the container with id bb9d2c698316f6af3480b619b2c5603728224c60aad632bbec2cec1c9b7ab9ae Feb 02 13:05:04 crc kubenswrapper[4721]: E0202 13:05:04.159439 4721 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.247:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18906fbb6280333d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-02 13:05:04.158741309 +0000 UTC m=+244.461255698,LastTimestamp:2026-02-02 13:05:04.158741309 +0000 UTC m=+244.461255698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 02 13:05:04 crc kubenswrapper[4721]: I0202 13:05:04.588737 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ed7c7e2a904888901a446739bb41ce209fe73c11466f51414b1b4e5af67c4884"} Feb 02 13:05:04 crc kubenswrapper[4721]: I0202 13:05:04.588781 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"bb9d2c698316f6af3480b619b2c5603728224c60aad632bbec2cec1c9b7ab9ae"} Feb 02 13:05:04 crc kubenswrapper[4721]: I0202 13:05:04.589817 4721 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:04 crc kubenswrapper[4721]: I0202 13:05:04.591595 4721 generic.go:334] "Generic (PLEG): container finished" podID="9755d24d-ee48-44a4-aa63-5b014999e3a9" containerID="ee2400de463f4d9467d730014ae34268bf8cc03fd437b95ba162e6c9f10ba80b" exitCode=0 Feb 02 13:05:04 crc kubenswrapper[4721]: I0202 13:05:04.591652 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9755d24d-ee48-44a4-aa63-5b014999e3a9","Type":"ContainerDied","Data":"ee2400de463f4d9467d730014ae34268bf8cc03fd437b95ba162e6c9f10ba80b"} Feb 02 13:05:04 crc kubenswrapper[4721]: I0202 13:05:04.592086 
4721 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:04 crc kubenswrapper[4721]: I0202 13:05:04.592322 4721 status_manager.go:851] "Failed to get status for pod" podUID="9755d24d-ee48-44a4-aa63-5b014999e3a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:04 crc kubenswrapper[4721]: I0202 13:05:04.594401 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 02 13:05:04 crc kubenswrapper[4721]: I0202 13:05:04.595828 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 13:05:04 crc kubenswrapper[4721]: I0202 13:05:04.596552 4721 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e" exitCode=0 Feb 02 13:05:04 crc kubenswrapper[4721]: I0202 13:05:04.596580 4721 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f" exitCode=0 Feb 02 13:05:04 crc kubenswrapper[4721]: I0202 13:05:04.596614 4721 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06" exitCode=0 Feb 02 13:05:04 crc kubenswrapper[4721]: I0202 13:05:04.596626 4721 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787" exitCode=2 Feb 02 13:05:04 crc kubenswrapper[4721]: I0202 13:05:04.596667 4721 scope.go:117] "RemoveContainer" containerID="db2bb86343542417950b2e7261cdad74cc3b3f05b9d471f222509715ac091a0d" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.353960 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" containerName="oauth-openshift" containerID="cri-o://81f1ed113ff28d261e45b8f089ff55331cfa48c5e28c0300ad3ede4e1aa70b95" gracePeriod=15 Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.609501 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.611962 4721 generic.go:334] "Generic (PLEG): container finished" podID="962524c6-7992-43d5-a7f3-5fdd04297f24" containerID="81f1ed113ff28d261e45b8f089ff55331cfa48c5e28c0300ad3ede4e1aa70b95" exitCode=0 Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.612265 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" 
event={"ID":"962524c6-7992-43d5-a7f3-5fdd04297f24","Type":"ContainerDied","Data":"81f1ed113ff28d261e45b8f089ff55331cfa48c5e28c0300ad3ede4e1aa70b95"} Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.720672 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.726324 4721 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.726708 4721 status_manager.go:851] "Failed to get status for pod" podUID="9755d24d-ee48-44a4-aa63-5b014999e3a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.726917 4721 status_manager.go:851] "Failed to get status for pod" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fqbhq\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.844906 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.845616 4721 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.846105 4721 status_manager.go:851] "Failed to get status for pod" podUID="9755d24d-ee48-44a4-aa63-5b014999e3a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.846421 4721 status_manager.go:851] "Failed to get status for pod" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fqbhq\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.891695 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/962524c6-7992-43d5-a7f3-5fdd04297f24-audit-dir\") pod \"962524c6-7992-43d5-a7f3-5fdd04297f24\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.892249 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-session\") pod \"962524c6-7992-43d5-a7f3-5fdd04297f24\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.892294 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-error\") pod \"962524c6-7992-43d5-a7f3-5fdd04297f24\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.892344 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-login\") pod \"962524c6-7992-43d5-a7f3-5fdd04297f24\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.892380 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-router-certs\") pod \"962524c6-7992-43d5-a7f3-5fdd04297f24\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.892260 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/962524c6-7992-43d5-a7f3-5fdd04297f24-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "962524c6-7992-43d5-a7f3-5fdd04297f24" (UID: "962524c6-7992-43d5-a7f3-5fdd04297f24"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.892406 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-idp-0-file-data\") pod \"962524c6-7992-43d5-a7f3-5fdd04297f24\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.892463 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-provider-selection\") pod \"962524c6-7992-43d5-a7f3-5fdd04297f24\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.892505 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-cliconfig\") pod \"962524c6-7992-43d5-a7f3-5fdd04297f24\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.892546 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l4n2\" (UniqueName: \"kubernetes.io/projected/962524c6-7992-43d5-a7f3-5fdd04297f24-kube-api-access-4l4n2\") pod \"962524c6-7992-43d5-a7f3-5fdd04297f24\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.892582 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-trusted-ca-bundle\") pod \"962524c6-7992-43d5-a7f3-5fdd04297f24\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.892614 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-serving-cert\") pod \"962524c6-7992-43d5-a7f3-5fdd04297f24\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.892665 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-ocp-branding-template\") pod \"962524c6-7992-43d5-a7f3-5fdd04297f24\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.892744 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-audit-policies\") pod \"962524c6-7992-43d5-a7f3-5fdd04297f24\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.892825 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-service-ca\") pod \"962524c6-7992-43d5-a7f3-5fdd04297f24\" (UID: \"962524c6-7992-43d5-a7f3-5fdd04297f24\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.893287 4721 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/962524c6-7992-43d5-a7f3-5fdd04297f24-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.895350 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "962524c6-7992-43d5-a7f3-5fdd04297f24" (UID: "962524c6-7992-43d5-a7f3-5fdd04297f24"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.895999 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "962524c6-7992-43d5-a7f3-5fdd04297f24" (UID: "962524c6-7992-43d5-a7f3-5fdd04297f24"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.899347 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "962524c6-7992-43d5-a7f3-5fdd04297f24" (UID: "962524c6-7992-43d5-a7f3-5fdd04297f24"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.900806 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "962524c6-7992-43d5-a7f3-5fdd04297f24" (UID: "962524c6-7992-43d5-a7f3-5fdd04297f24"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.902915 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "962524c6-7992-43d5-a7f3-5fdd04297f24" (UID: "962524c6-7992-43d5-a7f3-5fdd04297f24"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.903254 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "962524c6-7992-43d5-a7f3-5fdd04297f24" (UID: "962524c6-7992-43d5-a7f3-5fdd04297f24"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.912607 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/962524c6-7992-43d5-a7f3-5fdd04297f24-kube-api-access-4l4n2" (OuterVolumeSpecName: "kube-api-access-4l4n2") pod "962524c6-7992-43d5-a7f3-5fdd04297f24" (UID: "962524c6-7992-43d5-a7f3-5fdd04297f24"). InnerVolumeSpecName "kube-api-access-4l4n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.931615 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "962524c6-7992-43d5-a7f3-5fdd04297f24" (UID: "962524c6-7992-43d5-a7f3-5fdd04297f24"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.932728 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "962524c6-7992-43d5-a7f3-5fdd04297f24" (UID: "962524c6-7992-43d5-a7f3-5fdd04297f24"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.933683 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "962524c6-7992-43d5-a7f3-5fdd04297f24" (UID: "962524c6-7992-43d5-a7f3-5fdd04297f24"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.934114 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "962524c6-7992-43d5-a7f3-5fdd04297f24" (UID: "962524c6-7992-43d5-a7f3-5fdd04297f24"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.944311 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "962524c6-7992-43d5-a7f3-5fdd04297f24" (UID: "962524c6-7992-43d5-a7f3-5fdd04297f24"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.947427 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "962524c6-7992-43d5-a7f3-5fdd04297f24" (UID: "962524c6-7992-43d5-a7f3-5fdd04297f24"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995005 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9755d24d-ee48-44a4-aa63-5b014999e3a9-kubelet-dir\") pod \"9755d24d-ee48-44a4-aa63-5b014999e3a9\" (UID: \"9755d24d-ee48-44a4-aa63-5b014999e3a9\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995189 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9755d24d-ee48-44a4-aa63-5b014999e3a9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9755d24d-ee48-44a4-aa63-5b014999e3a9" (UID: "9755d24d-ee48-44a4-aa63-5b014999e3a9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995222 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9755d24d-ee48-44a4-aa63-5b014999e3a9-kube-api-access\") pod \"9755d24d-ee48-44a4-aa63-5b014999e3a9\" (UID: \"9755d24d-ee48-44a4-aa63-5b014999e3a9\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995266 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9755d24d-ee48-44a4-aa63-5b014999e3a9-var-lock\") pod \"9755d24d-ee48-44a4-aa63-5b014999e3a9\" (UID: \"9755d24d-ee48-44a4-aa63-5b014999e3a9\") " Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995483 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9755d24d-ee48-44a4-aa63-5b014999e3a9-var-lock" (OuterVolumeSpecName: "var-lock") pod "9755d24d-ee48-44a4-aa63-5b014999e3a9" (UID: "9755d24d-ee48-44a4-aa63-5b014999e3a9"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995524 4721 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9755d24d-ee48-44a4-aa63-5b014999e3a9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995641 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995669 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995686 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l4n2\" (UniqueName: \"kubernetes.io/projected/962524c6-7992-43d5-a7f3-5fdd04297f24-kube-api-access-4l4n2\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995700 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995717 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995730 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995748 4721 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995759 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995772 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995784 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995795 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995807 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:05 crc kubenswrapper[4721]: I0202 13:05:05.995818 4721 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/962524c6-7992-43d5-a7f3-5fdd04297f24-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.007864 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9755d24d-ee48-44a4-aa63-5b014999e3a9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9755d24d-ee48-44a4-aa63-5b014999e3a9" (UID: "9755d24d-ee48-44a4-aa63-5b014999e3a9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.097782 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9755d24d-ee48-44a4-aa63-5b014999e3a9-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.097835 4721 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9755d24d-ee48-44a4-aa63-5b014999e3a9-var-lock\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.189000 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.190349 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.191148 4721 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.191636 4721 status_manager.go:851] "Failed to get status for pod" podUID="9755d24d-ee48-44a4-aa63-5b014999e3a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.192191 4721 status_manager.go:851] "Failed to get status for pod" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fqbhq\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.192518 4721 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.300653 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.301062 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.300787 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.301114 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.301144 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.301155 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.301361 4721 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.301376 4721 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.301385 4721 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.415237 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.623956 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.625147 4721 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5" exitCode=0 Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.625233 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.625311 4721 scope.go:117] "RemoveContainer" containerID="15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.627151 4721 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.627919 4721 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.628167 4721 status_manager.go:851] "Failed to get status for pod" podUID="9755d24d-ee48-44a4-aa63-5b014999e3a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.628302 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.628397 4721 status_manager.go:851] "Failed to get status for pod" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fqbhq\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.628693 4721 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.628752 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" event={"ID":"962524c6-7992-43d5-a7f3-5fdd04297f24","Type":"ContainerDied","Data":"e1c6b11699215c240779ba4ffc084b0f044db3750d6c816f2d805a78f36b24e5"} Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.628859 4721 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.629233 4721 status_manager.go:851] "Failed to get status for pod" podUID="9755d24d-ee48-44a4-aa63-5b014999e3a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.247:6443: 
connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.629646 4721 status_manager.go:851] "Failed to get status for pod" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fqbhq\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.630740 4721 status_manager.go:851] "Failed to get status for pod" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fqbhq\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.631017 4721 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.632083 4721 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.633160 4721 status_manager.go:851] "Failed to get status for pod" podUID="9755d24d-ee48-44a4-aa63-5b014999e3a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.633532 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"9755d24d-ee48-44a4-aa63-5b014999e3a9","Type":"ContainerDied","Data":"77bc31793d5cf8c9c2a6f1e7dc74fa1124392a5cef395c56ee4646291d25f2eb"} Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.633605 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77bc31793d5cf8c9c2a6f1e7dc74fa1124392a5cef395c56ee4646291d25f2eb" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.633564 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.635549 4721 status_manager.go:851] "Failed to get status for pod" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fqbhq\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.636378 4721 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.636805 4721 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.637111 4721 status_manager.go:851] "Failed to get status for pod" podUID="9755d24d-ee48-44a4-aa63-5b014999e3a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.637551 4721 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.638107 4721 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.638426 4721 status_manager.go:851] "Failed to get status for pod" podUID="9755d24d-ee48-44a4-aa63-5b014999e3a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.638685 4721 status_manager.go:851] "Failed to get status for pod" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fqbhq\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.644180 4721 scope.go:117] "RemoveContainer" containerID="63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.661650 4721 scope.go:117] 
"RemoveContainer" containerID="64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.677280 4721 scope.go:117] "RemoveContainer" containerID="f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.691085 4721 scope.go:117] "RemoveContainer" containerID="5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.708638 4721 scope.go:117] "RemoveContainer" containerID="465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.734578 4721 scope.go:117] "RemoveContainer" containerID="15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e" Feb 02 13:05:06 crc kubenswrapper[4721]: E0202 13:05:06.735281 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\": container with ID starting with 15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e not found: ID does not exist" containerID="15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.735359 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e"} err="failed to get container status \"15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\": rpc error: code = NotFound desc = could not find container \"15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e\": container with ID starting with 15b88ed26cc224bfef04033a527dfc61b8cbb02ee20c8eeed61becaccc11026e not found: ID does not exist" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.735410 4721 scope.go:117] "RemoveContainer" containerID="63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f" Feb 02 13:05:06 crc kubenswrapper[4721]: E0202 13:05:06.736124 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\": container with ID starting with 63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f not found: ID does not exist" containerID="63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.736153 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f"} err="failed to get container status \"63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\": rpc error: code = NotFound desc = could not find container \"63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f\": container with ID starting with 63082a40cefc099dbdfa1d78cbb94c4c5e2f4be2bb97c3d95f8a0b198404811f not found: ID does not exist" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.736170 4721 scope.go:117] "RemoveContainer" containerID="64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06" Feb 02 13:05:06 crc kubenswrapper[4721]: E0202 13:05:06.736656 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\": 
container with ID starting with 64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06 not found: ID does not exist" containerID="64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.736747 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06"} err="failed to get container status \"64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\": rpc error: code = NotFound desc = could not find container \"64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06\": container with ID starting with 64df3044205a42c6ecd41f11a393c6bf1c2db8e9f91a451143d99a8ffb442f06 not found: ID does not exist" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.736833 4721 scope.go:117] "RemoveContainer" containerID="f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787" Feb 02 13:05:06 crc kubenswrapper[4721]: E0202 13:05:06.737267 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\": container with ID starting with f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787 not found: ID does not exist" containerID="f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.737327 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787"} err="failed to get container status \"f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\": rpc error: code = NotFound desc = could not find container \"f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787\": container with ID starting with f1623f9ea27423fd810955fe8cedbd4d7a6689b71b1639406f25d6173465f787 not found: ID does not exist" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.737365 4721 scope.go:117] "RemoveContainer" containerID="5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5" Feb 02 13:05:06 crc kubenswrapper[4721]: E0202 13:05:06.737791 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\": container with ID starting with 5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5 not found: ID does not exist" containerID="5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.737817 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5"} err="failed to get container status \"5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\": rpc error: code = NotFound desc = could not find container \"5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5\": container with ID starting with 5d3d2b7df6931f13d24fc821fc0893006a6501b48be95e1d99225c3862f842d5 not found: ID does not exist" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.737840 4721 scope.go:117] "RemoveContainer" containerID="465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c" Feb 02 13:05:06 crc kubenswrapper[4721]: E0202 13:05:06.738181 4721 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\": container with ID starting with 465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c not found: ID does not exist" containerID="465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.738214 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c"} err="failed to get container status \"465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\": rpc error: code = NotFound desc = could not find container \"465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c\": container with ID starting with 465986dbfff5026df7aa5a90471e79a8d3c101e0333d8065c1cffc6efa04015c not found: ID does not exist" Feb 02 13:05:06 crc kubenswrapper[4721]: I0202 13:05:06.738236 4721 scope.go:117] "RemoveContainer" containerID="81f1ed113ff28d261e45b8f089ff55331cfa48c5e28c0300ad3ede4e1aa70b95" Feb 02 13:05:09 crc kubenswrapper[4721]: E0202 13:05:09.668439 4721 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:09 crc kubenswrapper[4721]: E0202 13:05:09.669228 4721 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:09 crc kubenswrapper[4721]: E0202 13:05:09.669504 4721 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:09 crc kubenswrapper[4721]: E0202 13:05:09.669779 4721 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:09 crc kubenswrapper[4721]: E0202 13:05:09.670032 4721 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.247:6443: connect: connection refused" Feb 02 13:05:09 crc kubenswrapper[4721]: I0202 13:05:09.670057 4721 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 02 13:05:09 crc kubenswrapper[4721]: E0202 13:05:09.670334 4721 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.247:6443: connect: connection refused" interval="200ms" Feb 02 13:05:09 crc kubenswrapper[4721]: E0202 13:05:09.871336 4721 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.247:6443: connect: connection refused" interval="400ms" Feb 02 13:05:10 crc kubenswrapper[4721]: E0202 13:05:10.272667 4721 
Feb 02 13:05:10 crc kubenswrapper[4721]: I0202 13:05:10.413135 4721 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.247:6443: connect: connection refused"
Feb 02 13:05:10 crc kubenswrapper[4721]: I0202 13:05:10.413888 4721 status_manager.go:851] "Failed to get status for pod" podUID="9755d24d-ee48-44a4-aa63-5b014999e3a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.247:6443: connect: connection refused"
Feb 02 13:05:10 crc kubenswrapper[4721]: I0202 13:05:10.414297 4721 status_manager.go:851] "Failed to get status for pod" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fqbhq\": dial tcp 38.129.56.247:6443: connect: connection refused"
Feb 02 13:05:10 crc kubenswrapper[4721]: E0202 13:05:10.453553 4721 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.247:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18906fbb6280333d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-02 13:05:04.158741309 +0000 UTC m=+244.461255698,LastTimestamp:2026-02-02 13:05:04.158741309 +0000 UTC m=+244.461255698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 02 13:05:11 crc kubenswrapper[4721]: E0202 13:05:11.073475 4721 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.247:6443: connect: connection refused" interval="1.6s"
Feb 02 13:05:12 crc kubenswrapper[4721]: E0202 13:05:12.674685 4721 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.247:6443: connect: connection refused" interval="3.2s"
Feb 02 13:05:15 crc kubenswrapper[4721]: I0202 13:05:15.409699 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 13:05:15 crc kubenswrapper[4721]: I0202 13:05:15.410687 4721 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.247:6443: connect: connection refused"
Feb 02 13:05:15 crc kubenswrapper[4721]: I0202 13:05:15.410910 4721 status_manager.go:851] "Failed to get status for pod" podUID="9755d24d-ee48-44a4-aa63-5b014999e3a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.247:6443: connect: connection refused"
Feb 02 13:05:15 crc kubenswrapper[4721]: I0202 13:05:15.411420 4721 status_manager.go:851] "Failed to get status for pod" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fqbhq\": dial tcp 38.129.56.247:6443: connect: connection refused"
Feb 02 13:05:15 crc kubenswrapper[4721]: I0202 13:05:15.425455 4721 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c0939893-cc01-45bf-844d-77d599d4d0a4"
Feb 02 13:05:15 crc kubenswrapper[4721]: I0202 13:05:15.425850 4721 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c0939893-cc01-45bf-844d-77d599d4d0a4"
Feb 02 13:05:15 crc kubenswrapper[4721]: E0202 13:05:15.426309 4721 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 13:05:15 crc kubenswrapper[4721]: I0202 13:05:15.426738 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 13:05:15 crc kubenswrapper[4721]: I0202 13:05:15.680058 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1c2e6eb6f67afe25e98ac97691c60139417e30f5b3496fe487baafd1770cd7e2"}
Feb 02 13:05:15 crc kubenswrapper[4721]: I0202 13:05:15.680131 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6e1767853e47e04cbbcbdebe254c65faac594cdfc7d46260a0341dbe5bf5a195"}
Feb 02 13:05:15 crc kubenswrapper[4721]: I0202 13:05:15.680419 4721 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c0939893-cc01-45bf-844d-77d599d4d0a4"
Feb 02 13:05:15 crc kubenswrapper[4721]: I0202 13:05:15.680441 4721 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c0939893-cc01-45bf-844d-77d599d4d0a4"
Feb 02 13:05:15 crc kubenswrapper[4721]: E0202 13:05:15.680829 4721 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.247:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 13:05:15 crc kubenswrapper[4721]: I0202 13:05:15.680885 4721 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.129.56.247:6443: connect: connection refused"
Feb 02 13:05:15 crc kubenswrapper[4721]: I0202 13:05:15.681246 4721 status_manager.go:851] "Failed to get status for pod" podUID="9755d24d-ee48-44a4-aa63-5b014999e3a9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.247:6443: connect: connection refused"
Feb 02 13:05:15 crc kubenswrapper[4721]: I0202 13:05:15.681699 4721 status_manager.go:851] "Failed to get status for pod" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" pod="openshift-authentication/oauth-openshift-558db77b4-fqbhq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-fqbhq\": dial tcp 38.129.56.247:6443: connect: connection refused"
Feb 02 13:05:15 crc kubenswrapper[4721]: E0202 13:05:15.875911 4721 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.247:6443: connect: connection refused" interval="6.4s"
Feb 02 13:05:16 crc kubenswrapper[4721]: I0202 13:05:16.689405 4721 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="1c2e6eb6f67afe25e98ac97691c60139417e30f5b3496fe487baafd1770cd7e2" exitCode=0
Feb 02 13:05:16 crc kubenswrapper[4721]: I0202 13:05:16.689495 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"1c2e6eb6f67afe25e98ac97691c60139417e30f5b3496fe487baafd1770cd7e2"}
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"1c2e6eb6f67afe25e98ac97691c60139417e30f5b3496fe487baafd1770cd7e2"} Feb 02 13:05:16 crc kubenswrapper[4721]: I0202 13:05:16.689789 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8feeaa9e603205dae86de9199b6132b1ba82ba117e8a4943cc3f2adc19175757"} Feb 02 13:05:16 crc kubenswrapper[4721]: I0202 13:05:16.689804 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e117ae8029ca07abd811aacfa5b46d99b19f0719fa797e122926af43114dbc40"} Feb 02 13:05:16 crc kubenswrapper[4721]: I0202 13:05:16.689814 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f75a7ea60003f2d17e897e77f706085400f675d9f5d74bb0b62ba9f9f02620da"} Feb 02 13:05:16 crc kubenswrapper[4721]: I0202 13:05:16.689826 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d530490a660223c99ba2db0d44e273ac9ee98b2ce67ba36da894a4c22646431e"} Feb 02 13:05:16 crc kubenswrapper[4721]: I0202 13:05:16.693011 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 02 13:05:16 crc kubenswrapper[4721]: I0202 13:05:16.693100 4721 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f" exitCode=1 Feb 02 13:05:16 crc kubenswrapper[4721]: I0202 13:05:16.693141 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f"} Feb 02 13:05:16 crc kubenswrapper[4721]: I0202 13:05:16.693722 4721 scope.go:117] "RemoveContainer" containerID="35ad9a6eb5018e783ee5445aa61ea8e1d030f3fd0b1d6a3007ed1ed55b1b0f0f" Feb 02 13:05:17 crc kubenswrapper[4721]: I0202 13:05:17.706551 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 02 13:05:17 crc kubenswrapper[4721]: I0202 13:05:17.706637 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f7aabb411739ce80ab47c668025392aa9ac42d7b38f7c431572198e251ab9866"} Feb 02 13:05:17 crc kubenswrapper[4721]: I0202 13:05:17.710600 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d3e0bb60e40afe56a1ecd42e4dbeddf5e56a2c15f8bfaec75e826492cd09de47"} Feb 02 13:05:17 crc kubenswrapper[4721]: I0202 13:05:17.710839 4721 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c0939893-cc01-45bf-844d-77d599d4d0a4" Feb 02 13:05:17 crc 
kubenswrapper[4721]: I0202 13:05:17.710851 4721 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c0939893-cc01-45bf-844d-77d599d4d0a4" Feb 02 13:05:17 crc kubenswrapper[4721]: I0202 13:05:17.711088 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 13:05:18 crc kubenswrapper[4721]: I0202 13:05:18.398398 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 13:05:20 crc kubenswrapper[4721]: I0202 13:05:20.427534 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 13:05:20 crc kubenswrapper[4721]: I0202 13:05:20.427916 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 13:05:20 crc kubenswrapper[4721]: I0202 13:05:20.433494 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 13:05:22 crc kubenswrapper[4721]: I0202 13:05:22.723219 4721 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 13:05:23 crc kubenswrapper[4721]: I0202 13:05:23.739714 4721 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c0939893-cc01-45bf-844d-77d599d4d0a4" Feb 02 13:05:23 crc kubenswrapper[4721]: I0202 13:05:23.740094 4721 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c0939893-cc01-45bf-844d-77d599d4d0a4" Feb 02 13:05:23 crc kubenswrapper[4721]: I0202 13:05:23.744171 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 13:05:23 crc kubenswrapper[4721]: I0202 13:05:23.746446 4721 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4546d4e4-ab43-4ddd-8a17-31d5868b4a79" Feb 02 13:05:24 crc kubenswrapper[4721]: I0202 13:05:24.015632 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 13:05:24 crc kubenswrapper[4721]: I0202 13:05:24.019745 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 13:05:24 crc kubenswrapper[4721]: I0202 13:05:24.744473 4721 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c0939893-cc01-45bf-844d-77d599d4d0a4" Feb 02 13:05:24 crc kubenswrapper[4721]: I0202 13:05:24.744508 4721 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c0939893-cc01-45bf-844d-77d599d4d0a4" Feb 02 13:05:28 crc kubenswrapper[4721]: I0202 13:05:28.402642 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 13:05:28 crc kubenswrapper[4721]: I0202 13:05:28.957987 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 02 13:05:29 crc kubenswrapper[4721]: I0202 13:05:29.955639 4721 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 02 13:05:30 crc kubenswrapper[4721]: I0202 13:05:30.044373 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 02 13:05:30 crc kubenswrapper[4721]: I0202 13:05:30.129059 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 02 13:05:30 crc kubenswrapper[4721]: I0202 13:05:30.427716 4721 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4546d4e4-ab43-4ddd-8a17-31d5868b4a79" Feb 02 13:05:30 crc kubenswrapper[4721]: I0202 13:05:30.631046 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 02 13:05:30 crc kubenswrapper[4721]: I0202 13:05:30.674052 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 02 13:05:30 crc kubenswrapper[4721]: I0202 13:05:30.842355 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 02 13:05:30 crc kubenswrapper[4721]: I0202 13:05:30.850941 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 02 13:05:30 crc kubenswrapper[4721]: I0202 13:05:30.867355 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 02 13:05:31 crc kubenswrapper[4721]: I0202 13:05:31.269358 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 02 13:05:31 crc kubenswrapper[4721]: I0202 13:05:31.571574 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 02 13:05:31 crc kubenswrapper[4721]: I0202 13:05:31.637228 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 02 13:05:31 crc kubenswrapper[4721]: I0202 13:05:31.833258 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 02 13:05:32 crc kubenswrapper[4721]: I0202 13:05:32.075285 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 02 13:05:32 crc kubenswrapper[4721]: I0202 13:05:32.342396 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 02 13:05:32 crc kubenswrapper[4721]: I0202 13:05:32.362624 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 02 13:05:32 crc kubenswrapper[4721]: I0202 13:05:32.660403 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 02 13:05:34 crc kubenswrapper[4721]: I0202 13:05:34.095339 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 02 13:05:34 crc kubenswrapper[4721]: I0202 13:05:34.196650 4721 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-tls" Feb 02 13:05:34 crc kubenswrapper[4721]: I0202 13:05:34.280252 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 02 13:05:34 crc kubenswrapper[4721]: I0202 13:05:34.431746 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 02 13:05:34 crc kubenswrapper[4721]: I0202 13:05:34.482832 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 02 13:05:34 crc kubenswrapper[4721]: I0202 13:05:34.562277 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 02 13:05:34 crc kubenswrapper[4721]: I0202 13:05:34.686092 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 02 13:05:34 crc kubenswrapper[4721]: I0202 13:05:34.773138 4721 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 02 13:05:34 crc kubenswrapper[4721]: I0202 13:05:34.773235 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 02 13:05:34 crc kubenswrapper[4721]: I0202 13:05:34.942017 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 02 13:05:35 crc kubenswrapper[4721]: I0202 13:05:35.051049 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 02 13:05:35 crc kubenswrapper[4721]: I0202 13:05:35.329202 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 02 13:05:35 crc kubenswrapper[4721]: I0202 13:05:35.341102 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 02 13:05:35 crc kubenswrapper[4721]: I0202 13:05:35.447434 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 02 13:05:35 crc kubenswrapper[4721]: I0202 13:05:35.572232 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 02 13:05:35 crc kubenswrapper[4721]: I0202 13:05:35.812807 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 02 13:05:35 crc kubenswrapper[4721]: I0202 13:05:35.867820 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 02 13:05:36 crc kubenswrapper[4721]: I0202 13:05:36.000053 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 02 13:05:36 crc kubenswrapper[4721]: I0202 13:05:36.127241 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 02 13:05:36 crc kubenswrapper[4721]: I0202 13:05:36.220584 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 02 13:05:36 crc kubenswrapper[4721]: I0202 13:05:36.352485 4721 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 02 13:05:36 crc kubenswrapper[4721]: I0202 13:05:36.582957 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 02 13:05:36 crc kubenswrapper[4721]: I0202 13:05:36.662372 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 02 13:05:36 crc kubenswrapper[4721]: I0202 13:05:36.667196 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 02 13:05:36 crc kubenswrapper[4721]: I0202 13:05:36.682974 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 02 13:05:36 crc kubenswrapper[4721]: I0202 13:05:36.822578 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 02 13:05:36 crc kubenswrapper[4721]: I0202 13:05:36.888387 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 02 13:05:36 crc kubenswrapper[4721]: I0202 13:05:36.948359 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 02 13:05:37 crc kubenswrapper[4721]: I0202 13:05:37.072697 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 02 13:05:37 crc kubenswrapper[4721]: I0202 13:05:37.111932 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 02 13:05:37 crc kubenswrapper[4721]: I0202 13:05:37.144529 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 02 13:05:37 crc kubenswrapper[4721]: I0202 13:05:37.155979 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 02 13:05:37 crc kubenswrapper[4721]: I0202 13:05:37.305906 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 02 13:05:37 crc kubenswrapper[4721]: I0202 13:05:37.486299 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 02 13:05:37 crc kubenswrapper[4721]: I0202 13:05:37.558523 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 02 13:05:37 crc kubenswrapper[4721]: I0202 13:05:37.619619 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 02 13:05:37 crc kubenswrapper[4721]: I0202 13:05:37.654752 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 02 13:05:37 crc kubenswrapper[4721]: I0202 13:05:37.793700 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 02 13:05:37 crc kubenswrapper[4721]: I0202 13:05:37.885467 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 02 13:05:37 crc kubenswrapper[4721]: 
Feb 02 13:05:37 crc kubenswrapper[4721]: I0202 13:05:37.966033 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.042654 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.173699 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.197474 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.219971 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.252997 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.341547 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.436173 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.467131 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.468564 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.487166 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.654360 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.720369 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.734191 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.807709 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.893821 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 02 13:05:38 crc kubenswrapper[4721]: I0202 13:05:38.986912 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 02 13:05:39 crc kubenswrapper[4721]: I0202 13:05:39.162686 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 02 13:05:39 crc kubenswrapper[4721]: I0202 13:05:39.213806 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 02 13:05:39 crc kubenswrapper[4721]: I0202 13:05:39.229791 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 02 13:05:39 crc kubenswrapper[4721]: I0202 13:05:39.375259 4721 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 02 13:05:39 crc kubenswrapper[4721]: I0202 13:05:39.424015 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 02 13:05:39 crc kubenswrapper[4721]: I0202 13:05:39.493501 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 02 13:05:39 crc kubenswrapper[4721]: I0202 13:05:39.534408 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 02 13:05:39 crc kubenswrapper[4721]: I0202 13:05:39.607124 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 02 13:05:39 crc kubenswrapper[4721]: I0202 13:05:39.664693 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 02 13:05:39 crc kubenswrapper[4721]: I0202 13:05:39.728751 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 02 13:05:39 crc kubenswrapper[4721]: I0202 13:05:39.760330 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 02 13:05:39 crc kubenswrapper[4721]: I0202 13:05:39.769766 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 02 13:05:39 crc kubenswrapper[4721]: I0202 13:05:39.875013 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 02 13:05:39 crc kubenswrapper[4721]: I0202 13:05:39.945609 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 02 13:05:39 crc kubenswrapper[4721]: I0202 13:05:39.994454 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 02 13:05:40 crc kubenswrapper[4721]: I0202 13:05:40.031938 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 02 13:05:40 crc kubenswrapper[4721]: I0202 13:05:40.658514 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 02 13:05:40 crc kubenswrapper[4721]: I0202 13:05:40.729049 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 02 13:05:40 crc kubenswrapper[4721]: I0202 13:05:40.789587 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 02 13:05:40 crc kubenswrapper[4721]: I0202 13:05:40.949238 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 02 13:05:40 crc kubenswrapper[4721]: I0202 13:05:40.987512 4721 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 02 13:05:40 crc kubenswrapper[4721]: I0202 13:05:40.987849 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.004264 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.122391 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.159245 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.301949 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.359150 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.371343 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.478570 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.546268 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.557825 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.579250 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.590927 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.606979 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.666025 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.689435 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.701058 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.729197 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.802556 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.811929 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.889395 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.890038 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 02 13:05:41 crc kubenswrapper[4721]: I0202 13:05:41.942577 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 02 13:05:42 crc kubenswrapper[4721]: I0202 13:05:42.149239 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 02 13:05:42 crc kubenswrapper[4721]: I0202 13:05:42.212913 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 02 13:05:42 crc kubenswrapper[4721]: I0202 13:05:42.235922 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 02 13:05:42 crc kubenswrapper[4721]: I0202 13:05:42.343316 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 02 13:05:42 crc kubenswrapper[4721]: I0202 13:05:42.614923 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 02 13:05:42 crc kubenswrapper[4721]: I0202 13:05:42.697429 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 02 13:05:42 crc kubenswrapper[4721]: I0202 13:05:42.705740 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 02 13:05:42 crc kubenswrapper[4721]: I0202 13:05:42.754414 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 02 13:05:42 crc kubenswrapper[4721]: I0202 13:05:42.788226 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 02 13:05:42 crc kubenswrapper[4721]: I0202 13:05:42.939766 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 02 13:05:42 crc kubenswrapper[4721]: I0202 13:05:42.956939 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.135045 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.195061 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.253914 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.425237 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.474388 4721 reflector.go:368] Caches
populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.500430 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.500630 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.662470 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.677022 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.725291 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.736858 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.806345 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.811239 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.830841 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.851732 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 02 13:05:43 crc kubenswrapper[4721]: I0202 13:05:43.875922 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.072201 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.074297 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.264173 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.351986 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.356766 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.397398 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.450290 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 02 13:05:44 
crc kubenswrapper[4721]: I0202 13:05:44.451109 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.558016 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.653183 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.662619 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.679270 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.689859 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.721740 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.738592 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.774330 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.824269 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.934229 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 02 13:05:44 crc kubenswrapper[4721]: I0202 13:05:44.951366 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.037848 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.076754 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.126727 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.144005 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.167036 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.257113 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.286492 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.370874 4721 
reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.390634 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.540596 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.590334 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.652987 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.747744 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.748880 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.806772 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.848714 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.878621 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 02 13:05:45 crc kubenswrapper[4721]: I0202 13:05:45.931383 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 02 13:05:46 crc kubenswrapper[4721]: I0202 13:05:46.265801 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 02 13:05:46 crc kubenswrapper[4721]: I0202 13:05:46.339761 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 02 13:05:46 crc kubenswrapper[4721]: I0202 13:05:46.370433 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 02 13:05:46 crc kubenswrapper[4721]: I0202 13:05:46.412813 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 02 13:05:46 crc kubenswrapper[4721]: I0202 13:05:46.454433 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 02 13:05:46 crc kubenswrapper[4721]: I0202 13:05:46.574830 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 02 13:05:46 crc kubenswrapper[4721]: I0202 13:05:46.585088 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 02 13:05:46 crc kubenswrapper[4721]: I0202 13:05:46.620778 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 02 13:05:46 crc kubenswrapper[4721]: I0202 13:05:46.694299 4721 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 02 13:05:46 crc kubenswrapper[4721]: I0202 13:05:46.866343 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 02 13:05:46 crc kubenswrapper[4721]: I0202 13:05:46.943590 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 02 13:05:46 crc kubenswrapper[4721]: I0202 13:05:46.992039 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 02 13:05:46 crc kubenswrapper[4721]: I0202 13:05:46.992859 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 02 13:05:47 crc kubenswrapper[4721]: I0202 13:05:47.004992 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 02 13:05:47 crc kubenswrapper[4721]: I0202 13:05:47.138590 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 02 13:05:47 crc kubenswrapper[4721]: I0202 13:05:47.193119 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 02 13:05:47 crc kubenswrapper[4721]: I0202 13:05:47.196432 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 02 13:05:47 crc kubenswrapper[4721]: I0202 13:05:47.260267 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 13:05:47 crc kubenswrapper[4721]: I0202 13:05:47.364852 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 02 13:05:47 crc kubenswrapper[4721]: I0202 13:05:47.487844 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 02 13:05:47 crc kubenswrapper[4721]: I0202 13:05:47.498944 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 02 13:05:47 crc kubenswrapper[4721]: I0202 13:05:47.810806 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 02 13:05:47 crc kubenswrapper[4721]: I0202 13:05:47.866868 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 02 13:05:47 crc kubenswrapper[4721]: I0202 13:05:47.875336 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 02 13:05:47 crc kubenswrapper[4721]: I0202 13:05:47.896107 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 02 13:05:47 crc kubenswrapper[4721]: I0202 13:05:47.947205 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.034081 4721 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.179366 4721 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.272173 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.272182 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.299494 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.400172 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.464422 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.546741 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.569472 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.595118 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.608547 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.684516 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.793199 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.849244 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.906636 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 02 13:05:48 crc kubenswrapper[4721]: I0202 13:05:48.995475 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.365319 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.370720 4721 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.373162 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=46.373143571 podStartE2EDuration="46.373143571s" podCreationTimestamp="2026-02-02 13:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:05:22.49237574 +0000 UTC 
m=+262.794890129" watchObservedRunningTime="2026-02-02 13:05:49.373143571 +0000 UTC m=+289.675657990" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.376285 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fqbhq","openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.376406 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-76fc545986-xl9jq","openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 13:05:49 crc kubenswrapper[4721]: E0202 13:05:49.376680 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" containerName="oauth-openshift" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.376699 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" containerName="oauth-openshift" Feb 02 13:05:49 crc kubenswrapper[4721]: E0202 13:05:49.376716 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9755d24d-ee48-44a4-aa63-5b014999e3a9" containerName="installer" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.376725 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="9755d24d-ee48-44a4-aa63-5b014999e3a9" containerName="installer" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.376833 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="9755d24d-ee48-44a4-aa63-5b014999e3a9" containerName="installer" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.376847 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" containerName="oauth-openshift" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.377164 4721 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c0939893-cc01-45bf-844d-77d599d4d0a4" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.377218 4721 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c0939893-cc01-45bf-844d-77d599d4d0a4" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.377449 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.379883 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.380906 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.380962 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.381153 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.381440 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.381537 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.381714 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.381988 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.382132 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.382171 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.382412 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.382691 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.383384 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.385404 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.388818 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.392680 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.396434 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.430695 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=27.430603448 
podStartE2EDuration="27.430603448s" podCreationTimestamp="2026-02-02 13:05:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:05:49.419568303 +0000 UTC m=+289.722082692" watchObservedRunningTime="2026-02-02 13:05:49.430603448 +0000 UTC m=+289.733117837" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.481099 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.528828 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.528926 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-router-certs\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.528967 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.529412 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a219d6f3-ca00-4d64-9283-25b7502567c1-audit-policies\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.529494 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.529526 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-user-template-login\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.529547 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.529589 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-user-template-error\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.529666 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a219d6f3-ca00-4d64-9283-25b7502567c1-audit-dir\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.529722 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.529749 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-service-ca\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.529768 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47gjz\" (UniqueName: \"kubernetes.io/projected/a219d6f3-ca00-4d64-9283-25b7502567c1-kube-api-access-47gjz\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.529862 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-session\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.529882 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.630891 4721 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.631019 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a219d6f3-ca00-4d64-9283-25b7502567c1-audit-policies\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.631050 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.632021 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a219d6f3-ca00-4d64-9283-25b7502567c1-audit-policies\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.632170 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-user-template-login\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.632512 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.632203 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.632631 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-user-template-error\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.632680 4721 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a219d6f3-ca00-4d64-9283-25b7502567c1-audit-dir\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.632712 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.632740 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-service-ca\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.632765 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47gjz\" (UniqueName: \"kubernetes.io/projected/a219d6f3-ca00-4d64-9283-25b7502567c1-kube-api-access-47gjz\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.632813 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.632834 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-session\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.632866 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.632892 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-router-certs\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.633538 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.633606 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a219d6f3-ca00-4d64-9283-25b7502567c1-audit-dir\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.634026 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-service-ca\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.636436 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.638169 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-router-certs\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.638432 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.638473 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-user-template-error\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.638594 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.638742 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-user-template-login\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 
13:05:49.643274 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.643833 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.644014 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a219d6f3-ca00-4d64-9283-25b7502567c1-v4-0-config-system-session\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.651767 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47gjz\" (UniqueName: \"kubernetes.io/projected/a219d6f3-ca00-4d64-9283-25b7502567c1-kube-api-access-47gjz\") pod \"oauth-openshift-76fc545986-xl9jq\" (UID: \"a219d6f3-ca00-4d64-9283-25b7502567c1\") " pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.697851 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.758628 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.761017 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 02 13:05:49 crc kubenswrapper[4721]: I0202 13:05:49.912664 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 02 13:05:50 crc kubenswrapper[4721]: I0202 13:05:50.099715 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-76fc545986-xl9jq"] Feb 02 13:05:50 crc kubenswrapper[4721]: I0202 13:05:50.177918 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 02 13:05:50 crc kubenswrapper[4721]: I0202 13:05:50.413433 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 02 13:05:50 crc kubenswrapper[4721]: I0202 13:05:50.421180 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="962524c6-7992-43d5-a7f3-5fdd04297f24" path="/var/lib/kubelet/pods/962524c6-7992-43d5-a7f3-5fdd04297f24/volumes" Feb 02 13:05:50 crc kubenswrapper[4721]: I0202 13:05:50.565314 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 02 13:05:50 crc kubenswrapper[4721]: I0202 13:05:50.656301 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 02 13:05:50 crc kubenswrapper[4721]: I0202 13:05:50.883977 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" event={"ID":"a219d6f3-ca00-4d64-9283-25b7502567c1","Type":"ContainerStarted","Data":"35ee26dd46aeb9522e621c0a21c0e31ab749f5b9c5838ddb9d93f5a7579ed1d2"} Feb 02 13:05:50 crc kubenswrapper[4721]: I0202 13:05:50.884024 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" event={"ID":"a219d6f3-ca00-4d64-9283-25b7502567c1","Type":"ContainerStarted","Data":"60f69fc4db1373075ad2c602abd354831ef2372b4e324c7c6d804c6454e0bcc6"} Feb 02 13:05:50 crc kubenswrapper[4721]: I0202 13:05:50.885008 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:50 crc kubenswrapper[4721]: I0202 13:05:50.890664 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" Feb 02 13:05:50 crc kubenswrapper[4721]: I0202 13:05:50.905161 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-76fc545986-xl9jq" podStartSLOduration=70.9051428 podStartE2EDuration="1m10.9051428s" podCreationTimestamp="2026-02-02 13:04:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:05:50.90226177 +0000 UTC m=+291.204776159" watchObservedRunningTime="2026-02-02 13:05:50.9051428 +0000 UTC m=+291.207657199" Feb 02 13:05:51 crc kubenswrapper[4721]: I0202 13:05:51.351475 4721 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 02 13:05:52 crc kubenswrapper[4721]: I0202 13:05:52.126134 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 02 13:05:56 crc kubenswrapper[4721]: I0202 13:05:56.483883 4721 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 02 13:05:56 crc kubenswrapper[4721]: I0202 13:05:56.484475 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://ed7c7e2a904888901a446739bb41ce209fe73c11466f51414b1b4e5af67c4884" gracePeriod=5 Feb 02 13:06:00 crc kubenswrapper[4721]: I0202 13:06:00.217305 4721 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 02 13:06:01 crc kubenswrapper[4721]: I0202 13:06:01.949370 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 02 13:06:01 crc kubenswrapper[4721]: I0202 13:06:01.949715 4721 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="ed7c7e2a904888901a446739bb41ce209fe73c11466f51414b1b4e5af67c4884" exitCode=137 Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.101182 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.101280 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.292721 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.292853 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.292883 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.292952 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.292979 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.292976 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.293031 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.293051 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.293179 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.293233 4721 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.293248 4721 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.293261 4721 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.316345 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.394087 4721 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.394149 4721 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.416840 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.417118 4721 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.430113 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.430169 4721 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="92c983c1-4477-4a0d-ad15-0fb76214c795" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.432017 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.432089 4721 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="92c983c1-4477-4a0d-ad15-0fb76214c795" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.956019 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 02 13:06:02 crc kubenswrapper[4721]: I0202 13:06:02.956147 4721 scope.go:117] "RemoveContainer" containerID="ed7c7e2a904888901a446739bb41ce209fe73c11466f51414b1b4e5af67c4884" Feb 02 13:06:02 crc kubenswrapper[4721]: 
I0202 13:06:02.956308 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 13:06:07 crc kubenswrapper[4721]: I0202 13:06:07.997975 4721 generic.go:334] "Generic (PLEG): container finished" podID="2c9074bc-889d-4ce7-a250-6fc5984703e0" containerID="6e95f003df211d09b9562e86431541c3b7c3e84c41d01ea470d07b5cb914180b" exitCode=0 Feb 02 13:06:07 crc kubenswrapper[4721]: I0202 13:06:07.998098 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" event={"ID":"2c9074bc-889d-4ce7-a250-6fc5984703e0","Type":"ContainerDied","Data":"6e95f003df211d09b9562e86431541c3b7c3e84c41d01ea470d07b5cb914180b"} Feb 02 13:06:08 crc kubenswrapper[4721]: I0202 13:06:07.999640 4721 scope.go:117] "RemoveContainer" containerID="6e95f003df211d09b9562e86431541c3b7c3e84c41d01ea470d07b5cb914180b" Feb 02 13:06:09 crc kubenswrapper[4721]: I0202 13:06:09.005948 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" event={"ID":"2c9074bc-889d-4ce7-a250-6fc5984703e0","Type":"ContainerStarted","Data":"598f0ede0e5be4e9da5dde9217b5303de54f203ff19e55b638f8757dabfd9366"} Feb 02 13:06:09 crc kubenswrapper[4721]: I0202 13:06:09.006338 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:06:09 crc kubenswrapper[4721]: I0202 13:06:09.007507 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:06:12 crc kubenswrapper[4721]: I0202 13:06:12.843005 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ffkjd"] Feb 02 13:06:12 crc kubenswrapper[4721]: I0202 13:06:12.843968 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" podUID="3c0670a6-888e-40e3-bf5d-82779e70dd1c" containerName="controller-manager" containerID="cri-o://d435dac504fe3034e5527129f376c6ed65b5ea3e1fe83d1eb8463d6282795a18" gracePeriod=30 Feb 02 13:06:12 crc kubenswrapper[4721]: I0202 13:06:12.934226 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f"] Feb 02 13:06:12 crc kubenswrapper[4721]: I0202 13:06:12.934757 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" podUID="8f1e834f-23b5-42a5-9d13-b9e5720a597c" containerName="route-controller-manager" containerID="cri-o://e50c92de6389f2984e09f1db3217014b56b2e9a4e2a44d5d7b627a3954d39329" gracePeriod=30 Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.030162 4721 generic.go:334] "Generic (PLEG): container finished" podID="3c0670a6-888e-40e3-bf5d-82779e70dd1c" containerID="d435dac504fe3034e5527129f376c6ed65b5ea3e1fe83d1eb8463d6282795a18" exitCode=0 Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.030217 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" event={"ID":"3c0670a6-888e-40e3-bf5d-82779e70dd1c","Type":"ContainerDied","Data":"d435dac504fe3034e5527129f376c6ed65b5ea3e1fe83d1eb8463d6282795a18"} Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.337670 4721 util.go:48] "No ready sandbox 
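The marketplace-operator records just above show a container restart exactly as PLEG reports it: ContainerDied for one container ID, ContainerStarted for a new one, then the readiness probe flips from "" to "ready". A small sketch (the types and the abbreviated IDs are mine, for illustration only) that counts died-then-started pairs per pod UID:

package main

import "fmt"

// event mirrors the fields of the "SyncLoop (PLEG): event for pod" records above.
type event struct {
	PodID string // event "ID" in the log
	Type  string // "ContainerDied" or "ContainerStarted"
	Data  string // container ID (abbreviated below)
}

// restarts counts ContainerDied followed by ContainerStarted per pod UID,
// the pattern the marketplace-operator pod shows above.
func restarts(events []event) map[string]int {
	pending := map[string]bool{}
	counts := map[string]int{}
	for _, e := range events {
		switch e.Type {
		case "ContainerDied":
			pending[e.PodID] = true
		case "ContainerStarted":
			if pending[e.PodID] {
				counts[e.PodID]++
				pending[e.PodID] = false
			}
		}
	}
	return counts
}

func main() {
	evs := []event{
		{PodID: "2c9074bc-889d-4ce7-a250-6fc5984703e0", Type: "ContainerDied", Data: "6e95f003..."},
		{PodID: "2c9074bc-889d-4ce7-a250-6fc5984703e0", Type: "ContainerStarted", Data: "598f0ede..."},
	}
	fmt.Println(restarts(evs)) // map[2c9074bc-889d-4ce7-a250-6fc5984703e0:1]
}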
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.344258 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f"
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.444193 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c0670a6-888e-40e3-bf5d-82779e70dd1c-serving-cert\") pod \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") "
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.444273 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-config\") pod \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") "
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.444323 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-client-ca\") pod \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") "
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.444390 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nff5\" (UniqueName: \"kubernetes.io/projected/3c0670a6-888e-40e3-bf5d-82779e70dd1c-kube-api-access-4nff5\") pod \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") "
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.444631 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-proxy-ca-bundles\") pod \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\" (UID: \"3c0670a6-888e-40e3-bf5d-82779e70dd1c\") "
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.445136 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3c0670a6-888e-40e3-bf5d-82779e70dd1c" (UID: "3c0670a6-888e-40e3-bf5d-82779e70dd1c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.445487 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-config" (OuterVolumeSpecName: "config") pod "3c0670a6-888e-40e3-bf5d-82779e70dd1c" (UID: "3c0670a6-888e-40e3-bf5d-82779e70dd1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.447427 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-client-ca" (OuterVolumeSpecName: "client-ca") pod "3c0670a6-888e-40e3-bf5d-82779e70dd1c" (UID: "3c0670a6-888e-40e3-bf5d-82779e70dd1c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.447841 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-config\") on node \"crc\" DevicePath \"\""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.447866 4721 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-client-ca\") on node \"crc\" DevicePath \"\""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.447879 4721 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c0670a6-888e-40e3-bf5d-82779e70dd1c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.451588 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0670a6-888e-40e3-bf5d-82779e70dd1c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3c0670a6-888e-40e3-bf5d-82779e70dd1c" (UID: "3c0670a6-888e-40e3-bf5d-82779e70dd1c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.451854 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c0670a6-888e-40e3-bf5d-82779e70dd1c-kube-api-access-4nff5" (OuterVolumeSpecName: "kube-api-access-4nff5") pod "3c0670a6-888e-40e3-bf5d-82779e70dd1c" (UID: "3c0670a6-888e-40e3-bf5d-82779e70dd1c"). InnerVolumeSpecName "kube-api-access-4nff5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.548356 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f1e834f-23b5-42a5-9d13-b9e5720a597c-client-ca\") pod \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") "
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.548411 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f1e834f-23b5-42a5-9d13-b9e5720a597c-config\") pod \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") "
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.548444 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f1e834f-23b5-42a5-9d13-b9e5720a597c-serving-cert\") pod \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") "
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.548488 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2p87\" (UniqueName: \"kubernetes.io/projected/8f1e834f-23b5-42a5-9d13-b9e5720a597c-kube-api-access-w2p87\") pod \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\" (UID: \"8f1e834f-23b5-42a5-9d13-b9e5720a597c\") "
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.548614 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c0670a6-888e-40e3-bf5d-82779e70dd1c-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.548629 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nff5\" (UniqueName: \"kubernetes.io/projected/3c0670a6-888e-40e3-bf5d-82779e70dd1c-kube-api-access-4nff5\") on node \"crc\" DevicePath \"\""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.549497 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f1e834f-23b5-42a5-9d13-b9e5720a597c-client-ca" (OuterVolumeSpecName: "client-ca") pod "8f1e834f-23b5-42a5-9d13-b9e5720a597c" (UID: "8f1e834f-23b5-42a5-9d13-b9e5720a597c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.549684 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f1e834f-23b5-42a5-9d13-b9e5720a597c-config" (OuterVolumeSpecName: "config") pod "8f1e834f-23b5-42a5-9d13-b9e5720a597c" (UID: "8f1e834f-23b5-42a5-9d13-b9e5720a597c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.551926 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f1e834f-23b5-42a5-9d13-b9e5720a597c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8f1e834f-23b5-42a5-9d13-b9e5720a597c" (UID: "8f1e834f-23b5-42a5-9d13-b9e5720a597c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.551942 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f1e834f-23b5-42a5-9d13-b9e5720a597c-kube-api-access-w2p87" (OuterVolumeSpecName: "kube-api-access-w2p87") pod "8f1e834f-23b5-42a5-9d13-b9e5720a597c" (UID: "8f1e834f-23b5-42a5-9d13-b9e5720a597c"). InnerVolumeSpecName "kube-api-access-w2p87". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.649313 4721 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f1e834f-23b5-42a5-9d13-b9e5720a597c-client-ca\") on node \"crc\" DevicePath \"\""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.649360 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f1e834f-23b5-42a5-9d13-b9e5720a597c-config\") on node \"crc\" DevicePath \"\""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.649372 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f1e834f-23b5-42a5-9d13-b9e5720a597c-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 02 13:06:13 crc kubenswrapper[4721]: I0202 13:06:13.649385 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2p87\" (UniqueName: \"kubernetes.io/projected/8f1e834f-23b5-42a5-9d13-b9e5720a597c-kube-api-access-w2p87\") on node \"crc\" DevicePath \"\""
Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.045177 4721 generic.go:334] "Generic (PLEG): container finished" podID="8f1e834f-23b5-42a5-9d13-b9e5720a597c" containerID="e50c92de6389f2984e09f1db3217014b56b2e9a4e2a44d5d7b627a3954d39329" exitCode=0
Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.045239 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" event={"ID":"8f1e834f-23b5-42a5-9d13-b9e5720a597c","Type":"ContainerDied","Data":"e50c92de6389f2984e09f1db3217014b56b2e9a4e2a44d5d7b627a3954d39329"}
Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.045314 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f" event={"ID":"8f1e834f-23b5-42a5-9d13-b9e5720a597c","Type":"ContainerDied","Data":"9dac14241b7592e3b43fe2d27aa1874f518d588eab3c2210074f031e8ca8e1b4"}
Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.045378 4721 scope.go:117] "RemoveContainer" containerID="e50c92de6389f2984e09f1db3217014b56b2e9a4e2a44d5d7b627a3954d39329"
Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.045673 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f"
Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.050170 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" event={"ID":"3c0670a6-888e-40e3-bf5d-82779e70dd1c","Type":"ContainerDied","Data":"167d1cdbeb24f93927ece3e3fa3df789c23a8308344e8f29012657e06e53e904"}
Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.050308 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd"
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-ffkjd" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.068822 4721 scope.go:117] "RemoveContainer" containerID="e50c92de6389f2984e09f1db3217014b56b2e9a4e2a44d5d7b627a3954d39329" Feb 02 13:06:14 crc kubenswrapper[4721]: E0202 13:06:14.071187 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e50c92de6389f2984e09f1db3217014b56b2e9a4e2a44d5d7b627a3954d39329\": container with ID starting with e50c92de6389f2984e09f1db3217014b56b2e9a4e2a44d5d7b627a3954d39329 not found: ID does not exist" containerID="e50c92de6389f2984e09f1db3217014b56b2e9a4e2a44d5d7b627a3954d39329" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.071246 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e50c92de6389f2984e09f1db3217014b56b2e9a4e2a44d5d7b627a3954d39329"} err="failed to get container status \"e50c92de6389f2984e09f1db3217014b56b2e9a4e2a44d5d7b627a3954d39329\": rpc error: code = NotFound desc = could not find container \"e50c92de6389f2984e09f1db3217014b56b2e9a4e2a44d5d7b627a3954d39329\": container with ID starting with e50c92de6389f2984e09f1db3217014b56b2e9a4e2a44d5d7b627a3954d39329 not found: ID does not exist" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.071288 4721 scope.go:117] "RemoveContainer" containerID="d435dac504fe3034e5527129f376c6ed65b5ea3e1fe83d1eb8463d6282795a18" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.101363 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f"] Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.104861 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg49f"] Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.112714 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ffkjd"] Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.122422 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-ffkjd"] Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.251162 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5574d8cf7-r8p5p"] Feb 02 13:06:14 crc kubenswrapper[4721]: E0202 13:06:14.251394 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.251407 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 02 13:06:14 crc kubenswrapper[4721]: E0202 13:06:14.251419 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f1e834f-23b5-42a5-9d13-b9e5720a597c" containerName="route-controller-manager" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.251425 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f1e834f-23b5-42a5-9d13-b9e5720a597c" containerName="route-controller-manager" Feb 02 13:06:14 crc kubenswrapper[4721]: E0202 13:06:14.251440 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0670a6-888e-40e3-bf5d-82779e70dd1c" containerName="controller-manager" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.251447 
4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0670a6-888e-40e3-bf5d-82779e70dd1c" containerName="controller-manager" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.251530 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c0670a6-888e-40e3-bf5d-82779e70dd1c" containerName="controller-manager" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.251540 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.251554 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f1e834f-23b5-42a5-9d13-b9e5720a597c" containerName="route-controller-manager" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.251874 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.254010 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.254053 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.254149 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.254197 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.254462 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.254616 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.263942 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht"] Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.265129 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.269502 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.269614 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.269502 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.269793 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.270688 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcgl9\" (UniqueName: \"kubernetes.io/projected/2e2b299a-d51b-4bd3-9707-c4a04579e04d-kube-api-access-qcgl9\") pod \"controller-manager-5574d8cf7-r8p5p\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.270790 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-config\") pod \"controller-manager-5574d8cf7-r8p5p\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.270819 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-client-ca\") pod \"controller-manager-5574d8cf7-r8p5p\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.270862 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.271020 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.270920 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e2b299a-d51b-4bd3-9707-c4a04579e04d-serving-cert\") pod \"controller-manager-5574d8cf7-r8p5p\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.271258 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-proxy-ca-bundles\") pod \"controller-manager-5574d8cf7-r8p5p\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.273633 4721 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.287116 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5574d8cf7-r8p5p"] Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.305291 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht"] Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.372327 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcgl9\" (UniqueName: \"kubernetes.io/projected/2e2b299a-d51b-4bd3-9707-c4a04579e04d-kube-api-access-qcgl9\") pod \"controller-manager-5574d8cf7-r8p5p\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.372391 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3faddea-983f-4160-bd5d-0eb17dccf62f-serving-cert\") pod \"route-controller-manager-7944cd5597-n48ht\" (UID: \"c3faddea-983f-4160-bd5d-0eb17dccf62f\") " pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.372430 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3faddea-983f-4160-bd5d-0eb17dccf62f-config\") pod \"route-controller-manager-7944cd5597-n48ht\" (UID: \"c3faddea-983f-4160-bd5d-0eb17dccf62f\") " pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.372464 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-config\") pod \"controller-manager-5574d8cf7-r8p5p\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.372492 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-client-ca\") pod \"controller-manager-5574d8cf7-r8p5p\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.372533 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e2b299a-d51b-4bd3-9707-c4a04579e04d-serving-cert\") pod \"controller-manager-5574d8cf7-r8p5p\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.372555 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdqf4\" (UniqueName: \"kubernetes.io/projected/c3faddea-983f-4160-bd5d-0eb17dccf62f-kube-api-access-xdqf4\") pod \"route-controller-manager-7944cd5597-n48ht\" (UID: \"c3faddea-983f-4160-bd5d-0eb17dccf62f\") " pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.372582 4721 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-proxy-ca-bundles\") pod \"controller-manager-5574d8cf7-r8p5p\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.372609 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3faddea-983f-4160-bd5d-0eb17dccf62f-client-ca\") pod \"route-controller-manager-7944cd5597-n48ht\" (UID: \"c3faddea-983f-4160-bd5d-0eb17dccf62f\") " pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.373835 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-proxy-ca-bundles\") pod \"controller-manager-5574d8cf7-r8p5p\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.373864 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-client-ca\") pod \"controller-manager-5574d8cf7-r8p5p\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.375935 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-config\") pod \"controller-manager-5574d8cf7-r8p5p\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.381779 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e2b299a-d51b-4bd3-9707-c4a04579e04d-serving-cert\") pod \"controller-manager-5574d8cf7-r8p5p\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.390645 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcgl9\" (UniqueName: \"kubernetes.io/projected/2e2b299a-d51b-4bd3-9707-c4a04579e04d-kube-api-access-qcgl9\") pod \"controller-manager-5574d8cf7-r8p5p\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.417558 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c0670a6-888e-40e3-bf5d-82779e70dd1c" path="/var/lib/kubelet/pods/3c0670a6-888e-40e3-bf5d-82779e70dd1c/volumes" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.418224 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f1e834f-23b5-42a5-9d13-b9e5720a597c" path="/var/lib/kubelet/pods/8f1e834f-23b5-42a5-9d13-b9e5720a597c/volumes" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.473354 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c3faddea-983f-4160-bd5d-0eb17dccf62f-serving-cert\") pod \"route-controller-manager-7944cd5597-n48ht\" (UID: \"c3faddea-983f-4160-bd5d-0eb17dccf62f\") " pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.474199 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3faddea-983f-4160-bd5d-0eb17dccf62f-config\") pod \"route-controller-manager-7944cd5597-n48ht\" (UID: \"c3faddea-983f-4160-bd5d-0eb17dccf62f\") " pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.475427 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3faddea-983f-4160-bd5d-0eb17dccf62f-config\") pod \"route-controller-manager-7944cd5597-n48ht\" (UID: \"c3faddea-983f-4160-bd5d-0eb17dccf62f\") " pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.475434 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdqf4\" (UniqueName: \"kubernetes.io/projected/c3faddea-983f-4160-bd5d-0eb17dccf62f-kube-api-access-xdqf4\") pod \"route-controller-manager-7944cd5597-n48ht\" (UID: \"c3faddea-983f-4160-bd5d-0eb17dccf62f\") " pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.475548 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3faddea-983f-4160-bd5d-0eb17dccf62f-client-ca\") pod \"route-controller-manager-7944cd5597-n48ht\" (UID: \"c3faddea-983f-4160-bd5d-0eb17dccf62f\") " pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.476666 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3faddea-983f-4160-bd5d-0eb17dccf62f-client-ca\") pod \"route-controller-manager-7944cd5597-n48ht\" (UID: \"c3faddea-983f-4160-bd5d-0eb17dccf62f\") " pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.478428 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3faddea-983f-4160-bd5d-0eb17dccf62f-serving-cert\") pod \"route-controller-manager-7944cd5597-n48ht\" (UID: \"c3faddea-983f-4160-bd5d-0eb17dccf62f\") " pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.502827 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdqf4\" (UniqueName: \"kubernetes.io/projected/c3faddea-983f-4160-bd5d-0eb17dccf62f-kube-api-access-xdqf4\") pod \"route-controller-manager-7944cd5597-n48ht\" (UID: \"c3faddea-983f-4160-bd5d-0eb17dccf62f\") " pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.601804 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.613107 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.845989 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht"] Feb 02 13:06:14 crc kubenswrapper[4721]: I0202 13:06:14.902104 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5574d8cf7-r8p5p"] Feb 02 13:06:15 crc kubenswrapper[4721]: I0202 13:06:15.060613 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" event={"ID":"c3faddea-983f-4160-bd5d-0eb17dccf62f","Type":"ContainerStarted","Data":"38aacc58cd27509baf6164a31e18b4d598488511522d77236da9c5fe7c5b5fe9"} Feb 02 13:06:15 crc kubenswrapper[4721]: I0202 13:06:15.061437 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" event={"ID":"2e2b299a-d51b-4bd3-9707-c4a04579e04d","Type":"ContainerStarted","Data":"d56f67b7dea16a1c0a8eddfdc1ce0e7b9e600642d46866721c9ef00924975ed4"} Feb 02 13:06:16 crc kubenswrapper[4721]: I0202 13:06:16.068819 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" event={"ID":"c3faddea-983f-4160-bd5d-0eb17dccf62f","Type":"ContainerStarted","Data":"bb4e5ab76660987913b2ae3c95246cb5a7af0eb7e0f0f7cf6fdef445c9b2b429"} Feb 02 13:06:16 crc kubenswrapper[4721]: I0202 13:06:16.069478 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:16 crc kubenswrapper[4721]: I0202 13:06:16.073977 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" event={"ID":"2e2b299a-d51b-4bd3-9707-c4a04579e04d","Type":"ContainerStarted","Data":"1aab96e4774bff0ac80b1ac1190055d2e982db0b177f1befdc8d948f25085b9d"} Feb 02 13:06:16 crc kubenswrapper[4721]: I0202 13:06:16.074387 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:16 crc kubenswrapper[4721]: I0202 13:06:16.075428 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" Feb 02 13:06:16 crc kubenswrapper[4721]: I0202 13:06:16.087160 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:16 crc kubenswrapper[4721]: I0202 13:06:16.100381 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7944cd5597-n48ht" podStartSLOduration=2.100363128 podStartE2EDuration="2.100363128s" podCreationTimestamp="2026-02-02 13:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:06:16.09137621 +0000 UTC m=+316.393890609" watchObservedRunningTime="2026-02-02 13:06:16.100363128 +0000 UTC m=+316.402877517" Feb 02 
13:06:16 crc kubenswrapper[4721]: I0202 13:06:16.140631 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" podStartSLOduration=2.14060627 podStartE2EDuration="2.14060627s" podCreationTimestamp="2026-02-02 13:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:06:16.136940759 +0000 UTC m=+316.439455168" watchObservedRunningTime="2026-02-02 13:06:16.14060627 +0000 UTC m=+316.443120679" Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.350809 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5574d8cf7-r8p5p"] Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.351368 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" podUID="2e2b299a-d51b-4bd3-9707-c4a04579e04d" containerName="controller-manager" containerID="cri-o://1aab96e4774bff0ac80b1ac1190055d2e982db0b177f1befdc8d948f25085b9d" gracePeriod=30 Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.778804 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.952762 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e2b299a-d51b-4bd3-9707-c4a04579e04d-serving-cert\") pod \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.952832 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-config\") pod \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.952909 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-client-ca\") pod \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.952989 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-proxy-ca-bundles\") pod \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.953562 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2e2b299a-d51b-4bd3-9707-c4a04579e04d" (UID: "2e2b299a-d51b-4bd3-9707-c4a04579e04d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.953593 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-config" (OuterVolumeSpecName: "config") pod "2e2b299a-d51b-4bd3-9707-c4a04579e04d" (UID: "2e2b299a-d51b-4bd3-9707-c4a04579e04d"). 
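In the pod_startup_latency_tracker records above, podStartSLOduration is watchObservedRunningTime minus podCreationTimestamp; the firstStartedPulling/lastFinishedPulling stamps are the zero time here, so no image-pull window gets excluded. A quick check (my own sketch) against the route-controller-manager numbers:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the tracker record above; the layout is the
	// one produced by Go's time.Time.String().
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2026-02-02 13:06:14 +0000 UTC")
	if err != nil {
		panic(err)
	}
	running, err := time.Parse(layout, "2026-02-02 13:06:16.100363128 +0000 UTC")
	if err != nil {
		panic(err)
	}
	fmt.Println(running.Sub(created)) // 2.100363128s, matching podStartSLOduration
}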
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.953690 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-client-ca" (OuterVolumeSpecName: "client-ca") pod "2e2b299a-d51b-4bd3-9707-c4a04579e04d" (UID: "2e2b299a-d51b-4bd3-9707-c4a04579e04d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.953904 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcgl9\" (UniqueName: \"kubernetes.io/projected/2e2b299a-d51b-4bd3-9707-c4a04579e04d-kube-api-access-qcgl9\") pod \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\" (UID: \"2e2b299a-d51b-4bd3-9707-c4a04579e04d\") " Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.954251 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.954269 4721 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.954279 4721 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2e2b299a-d51b-4bd3-9707-c4a04579e04d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.959477 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e2b299a-d51b-4bd3-9707-c4a04579e04d-kube-api-access-qcgl9" (OuterVolumeSpecName: "kube-api-access-qcgl9") pod "2e2b299a-d51b-4bd3-9707-c4a04579e04d" (UID: "2e2b299a-d51b-4bd3-9707-c4a04579e04d"). InnerVolumeSpecName "kube-api-access-qcgl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:06:19 crc kubenswrapper[4721]: I0202 13:06:19.961443 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e2b299a-d51b-4bd3-9707-c4a04579e04d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2e2b299a-d51b-4bd3-9707-c4a04579e04d" (UID: "2e2b299a-d51b-4bd3-9707-c4a04579e04d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.055165 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcgl9\" (UniqueName: \"kubernetes.io/projected/2e2b299a-d51b-4bd3-9707-c4a04579e04d-kube-api-access-qcgl9\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.055199 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e2b299a-d51b-4bd3-9707-c4a04579e04d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.103396 4721 generic.go:334] "Generic (PLEG): container finished" podID="2e2b299a-d51b-4bd3-9707-c4a04579e04d" containerID="1aab96e4774bff0ac80b1ac1190055d2e982db0b177f1befdc8d948f25085b9d" exitCode=0 Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.103435 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.103448 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" event={"ID":"2e2b299a-d51b-4bd3-9707-c4a04579e04d","Type":"ContainerDied","Data":"1aab96e4774bff0ac80b1ac1190055d2e982db0b177f1befdc8d948f25085b9d"} Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.103479 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5574d8cf7-r8p5p" event={"ID":"2e2b299a-d51b-4bd3-9707-c4a04579e04d","Type":"ContainerDied","Data":"d56f67b7dea16a1c0a8eddfdc1ce0e7b9e600642d46866721c9ef00924975ed4"} Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.103501 4721 scope.go:117] "RemoveContainer" containerID="1aab96e4774bff0ac80b1ac1190055d2e982db0b177f1befdc8d948f25085b9d" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.125172 4721 scope.go:117] "RemoveContainer" containerID="1aab96e4774bff0ac80b1ac1190055d2e982db0b177f1befdc8d948f25085b9d" Feb 02 13:06:20 crc kubenswrapper[4721]: E0202 13:06:20.125609 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aab96e4774bff0ac80b1ac1190055d2e982db0b177f1befdc8d948f25085b9d\": container with ID starting with 1aab96e4774bff0ac80b1ac1190055d2e982db0b177f1befdc8d948f25085b9d not found: ID does not exist" containerID="1aab96e4774bff0ac80b1ac1190055d2e982db0b177f1befdc8d948f25085b9d" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.125666 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aab96e4774bff0ac80b1ac1190055d2e982db0b177f1befdc8d948f25085b9d"} err="failed to get container status \"1aab96e4774bff0ac80b1ac1190055d2e982db0b177f1befdc8d948f25085b9d\": rpc error: code = NotFound desc = could not find container \"1aab96e4774bff0ac80b1ac1190055d2e982db0b177f1befdc8d948f25085b9d\": container with ID starting with 1aab96e4774bff0ac80b1ac1190055d2e982db0b177f1befdc8d948f25085b9d not found: ID does not exist" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.132983 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5574d8cf7-r8p5p"] Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.136260 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5574d8cf7-r8p5p"] Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.416348 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e2b299a-d51b-4bd3-9707-c4a04579e04d" path="/var/lib/kubelet/pods/2e2b299a-d51b-4bd3-9707-c4a04579e04d/volumes" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.466896 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77b768d98c-wq8j4"] Feb 02 13:06:20 crc kubenswrapper[4721]: E0202 13:06:20.467121 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e2b299a-d51b-4bd3-9707-c4a04579e04d" containerName="controller-manager" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.467135 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e2b299a-d51b-4bd3-9707-c4a04579e04d" containerName="controller-manager" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.467248 4721 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2e2b299a-d51b-4bd3-9707-c4a04579e04d" containerName="controller-manager" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.467609 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.470310 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.470673 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.470840 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.471671 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.472497 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.475606 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.476682 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.485840 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77b768d98c-wq8j4"] Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.661880 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-client-ca\") pod \"controller-manager-77b768d98c-wq8j4\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.661957 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n98g\" (UniqueName: \"kubernetes.io/projected/25a832ca-eec3-4483-b12a-4bc922d51326-kube-api-access-2n98g\") pod \"controller-manager-77b768d98c-wq8j4\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.661986 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-config\") pod \"controller-manager-77b768d98c-wq8j4\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.662003 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25a832ca-eec3-4483-b12a-4bc922d51326-serving-cert\") pod \"controller-manager-77b768d98c-wq8j4\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc 
kubenswrapper[4721]: I0202 13:06:20.662037 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-proxy-ca-bundles\") pod \"controller-manager-77b768d98c-wq8j4\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.763520 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-client-ca\") pod \"controller-manager-77b768d98c-wq8j4\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.763840 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n98g\" (UniqueName: \"kubernetes.io/projected/25a832ca-eec3-4483-b12a-4bc922d51326-kube-api-access-2n98g\") pod \"controller-manager-77b768d98c-wq8j4\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.763869 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-config\") pod \"controller-manager-77b768d98c-wq8j4\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.763886 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25a832ca-eec3-4483-b12a-4bc922d51326-serving-cert\") pod \"controller-manager-77b768d98c-wq8j4\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.763923 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-proxy-ca-bundles\") pod \"controller-manager-77b768d98c-wq8j4\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.764844 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-proxy-ca-bundles\") pod \"controller-manager-77b768d98c-wq8j4\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.765125 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-client-ca\") pod \"controller-manager-77b768d98c-wq8j4\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.765526 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
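Every kubenswrapper record in this log uses the klog header format: a severity letter (I/W/E/F) fused with MMDD, then wall-clock time, PID, and source file:line before the message, e.g. I0202 13:06:20.763520 4721 reconciler_common.go:218] "...". A sketch (my own) of a parser for that prefix:

package main

import (
	"fmt"
	"regexp"
)

// klogHeader matches the prefix carried by the records above:
// severity(I/W/E/F) + MMDD, HH:MM:SS.micros, PID, file.go:line], message.
var klogHeader = regexp.MustCompile(`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+)\s+([\w./-]+:\d+)\]\s(.*)$`)

func main() {
	sample := `I0202 13:06:20.763520 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume ..."`
	m := klogHeader.FindStringSubmatch(sample)
	if m == nil {
		fmt.Println("not a klog record")
		return
	}
	fmt.Printf("severity=%s date=%s time=%s pid=%s source=%s\nmsg=%s\n",
		m[1], m[2], m[3], m[4], m[5], m[6])
}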
\"kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-config\") pod \"controller-manager-77b768d98c-wq8j4\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.775444 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25a832ca-eec3-4483-b12a-4bc922d51326-serving-cert\") pod \"controller-manager-77b768d98c-wq8j4\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.782868 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n98g\" (UniqueName: \"kubernetes.io/projected/25a832ca-eec3-4483-b12a-4bc922d51326-kube-api-access-2n98g\") pod \"controller-manager-77b768d98c-wq8j4\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:20 crc kubenswrapper[4721]: I0202 13:06:20.783184 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:21 crc kubenswrapper[4721]: I0202 13:06:21.012937 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77b768d98c-wq8j4"] Feb 02 13:06:21 crc kubenswrapper[4721]: W0202 13:06:21.017655 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25a832ca_eec3_4483_b12a_4bc922d51326.slice/crio-12aa00937c8b0761da5e1b6892ac57c359e37da61e443d450319dba4f1cacec8 WatchSource:0}: Error finding container 12aa00937c8b0761da5e1b6892ac57c359e37da61e443d450319dba4f1cacec8: Status 404 returned error can't find the container with id 12aa00937c8b0761da5e1b6892ac57c359e37da61e443d450319dba4f1cacec8 Feb 02 13:06:21 crc kubenswrapper[4721]: I0202 13:06:21.111038 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" event={"ID":"25a832ca-eec3-4483-b12a-4bc922d51326","Type":"ContainerStarted","Data":"12aa00937c8b0761da5e1b6892ac57c359e37da61e443d450319dba4f1cacec8"} Feb 02 13:06:22 crc kubenswrapper[4721]: I0202 13:06:22.123011 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" event={"ID":"25a832ca-eec3-4483-b12a-4bc922d51326","Type":"ContainerStarted","Data":"816a45936f6803b12e0ea97f51048b495653f4de87354e7bba1a4d4d4b951e79"} Feb 02 13:06:22 crc kubenswrapper[4721]: I0202 13:06:22.123606 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:22 crc kubenswrapper[4721]: I0202 13:06:22.130391 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:22 crc kubenswrapper[4721]: I0202 13:06:22.151277 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" podStartSLOduration=3.151247385 podStartE2EDuration="3.151247385s" podCreationTimestamp="2026-02-02 13:06:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-02 13:06:22.148285683 +0000 UTC m=+322.450800072" watchObservedRunningTime="2026-02-02 13:06:22.151247385 +0000 UTC m=+322.453761774" Feb 02 13:06:32 crc kubenswrapper[4721]: I0202 13:06:32.824051 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77b768d98c-wq8j4"] Feb 02 13:06:32 crc kubenswrapper[4721]: I0202 13:06:32.824871 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" podUID="25a832ca-eec3-4483-b12a-4bc922d51326" containerName="controller-manager" containerID="cri-o://816a45936f6803b12e0ea97f51048b495653f4de87354e7bba1a4d4d4b951e79" gracePeriod=30 Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.207460 4721 generic.go:334] "Generic (PLEG): container finished" podID="25a832ca-eec3-4483-b12a-4bc922d51326" containerID="816a45936f6803b12e0ea97f51048b495653f4de87354e7bba1a4d4d4b951e79" exitCode=0 Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.207678 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" event={"ID":"25a832ca-eec3-4483-b12a-4bc922d51326","Type":"ContainerDied","Data":"816a45936f6803b12e0ea97f51048b495653f4de87354e7bba1a4d4d4b951e79"} Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.328007 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.515406 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-config\") pod \"25a832ca-eec3-4483-b12a-4bc922d51326\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.515570 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n98g\" (UniqueName: \"kubernetes.io/projected/25a832ca-eec3-4483-b12a-4bc922d51326-kube-api-access-2n98g\") pod \"25a832ca-eec3-4483-b12a-4bc922d51326\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.515600 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-proxy-ca-bundles\") pod \"25a832ca-eec3-4483-b12a-4bc922d51326\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.515647 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-client-ca\") pod \"25a832ca-eec3-4483-b12a-4bc922d51326\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.515675 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25a832ca-eec3-4483-b12a-4bc922d51326-serving-cert\") pod \"25a832ca-eec3-4483-b12a-4bc922d51326\" (UID: \"25a832ca-eec3-4483-b12a-4bc922d51326\") " Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.517586 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-proxy-ca-bundles" (OuterVolumeSpecName: 
"proxy-ca-bundles") pod "25a832ca-eec3-4483-b12a-4bc922d51326" (UID: "25a832ca-eec3-4483-b12a-4bc922d51326"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.517666 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-config" (OuterVolumeSpecName: "config") pod "25a832ca-eec3-4483-b12a-4bc922d51326" (UID: "25a832ca-eec3-4483-b12a-4bc922d51326"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.517677 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-client-ca" (OuterVolumeSpecName: "client-ca") pod "25a832ca-eec3-4483-b12a-4bc922d51326" (UID: "25a832ca-eec3-4483-b12a-4bc922d51326"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.521396 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25a832ca-eec3-4483-b12a-4bc922d51326-kube-api-access-2n98g" (OuterVolumeSpecName: "kube-api-access-2n98g") pod "25a832ca-eec3-4483-b12a-4bc922d51326" (UID: "25a832ca-eec3-4483-b12a-4bc922d51326"). InnerVolumeSpecName "kube-api-access-2n98g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.521434 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25a832ca-eec3-4483-b12a-4bc922d51326-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "25a832ca-eec3-4483-b12a-4bc922d51326" (UID: "25a832ca-eec3-4483-b12a-4bc922d51326"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.617795 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n98g\" (UniqueName: \"kubernetes.io/projected/25a832ca-eec3-4483-b12a-4bc922d51326-kube-api-access-2n98g\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.617838 4721 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.617848 4721 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.617857 4721 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25a832ca-eec3-4483-b12a-4bc922d51326-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:33 crc kubenswrapper[4721]: I0202 13:06:33.617865 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25a832ca-eec3-4483-b12a-4bc922d51326-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.215142 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" event={"ID":"25a832ca-eec3-4483-b12a-4bc922d51326","Type":"ContainerDied","Data":"12aa00937c8b0761da5e1b6892ac57c359e37da61e443d450319dba4f1cacec8"} Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.215203 4721 scope.go:117] "RemoveContainer" containerID="816a45936f6803b12e0ea97f51048b495653f4de87354e7bba1a4d4d4b951e79" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.215215 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77b768d98c-wq8j4" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.242231 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77b768d98c-wq8j4"] Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.246591 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77b768d98c-wq8j4"] Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.416739 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25a832ca-eec3-4483-b12a-4bc922d51326" path="/var/lib/kubelet/pods/25a832ca-eec3-4483-b12a-4bc922d51326/volumes" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.477373 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5574d8cf7-t7w7b"] Feb 02 13:06:34 crc kubenswrapper[4721]: E0202 13:06:34.477626 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a832ca-eec3-4483-b12a-4bc922d51326" containerName="controller-manager" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.477641 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a832ca-eec3-4483-b12a-4bc922d51326" containerName="controller-manager" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.477760 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a832ca-eec3-4483-b12a-4bc922d51326" containerName="controller-manager" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.478213 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.480901 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.481014 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.482412 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.482632 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.482777 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.482982 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.493001 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.494974 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5574d8cf7-t7w7b"] Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.531923 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6266a42-238a-497e-ba45-2994385106f8-serving-cert\") pod \"controller-manager-5574d8cf7-t7w7b\" (UID: 
\"e6266a42-238a-497e-ba45-2994385106f8\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.531980 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6266a42-238a-497e-ba45-2994385106f8-proxy-ca-bundles\") pod \"controller-manager-5574d8cf7-t7w7b\" (UID: \"e6266a42-238a-497e-ba45-2994385106f8\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.532026 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6266a42-238a-497e-ba45-2994385106f8-client-ca\") pod \"controller-manager-5574d8cf7-t7w7b\" (UID: \"e6266a42-238a-497e-ba45-2994385106f8\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.532061 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6266a42-238a-497e-ba45-2994385106f8-config\") pod \"controller-manager-5574d8cf7-t7w7b\" (UID: \"e6266a42-238a-497e-ba45-2994385106f8\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.532213 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfgmv\" (UniqueName: \"kubernetes.io/projected/e6266a42-238a-497e-ba45-2994385106f8-kube-api-access-cfgmv\") pod \"controller-manager-5574d8cf7-t7w7b\" (UID: \"e6266a42-238a-497e-ba45-2994385106f8\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.633662 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6266a42-238a-497e-ba45-2994385106f8-serving-cert\") pod \"controller-manager-5574d8cf7-t7w7b\" (UID: \"e6266a42-238a-497e-ba45-2994385106f8\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.633719 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6266a42-238a-497e-ba45-2994385106f8-proxy-ca-bundles\") pod \"controller-manager-5574d8cf7-t7w7b\" (UID: \"e6266a42-238a-497e-ba45-2994385106f8\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.633754 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6266a42-238a-497e-ba45-2994385106f8-client-ca\") pod \"controller-manager-5574d8cf7-t7w7b\" (UID: \"e6266a42-238a-497e-ba45-2994385106f8\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.633783 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6266a42-238a-497e-ba45-2994385106f8-config\") pod \"controller-manager-5574d8cf7-t7w7b\" (UID: \"e6266a42-238a-497e-ba45-2994385106f8\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: 
I0202 13:06:34.633799 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfgmv\" (UniqueName: \"kubernetes.io/projected/e6266a42-238a-497e-ba45-2994385106f8-kube-api-access-cfgmv\") pod \"controller-manager-5574d8cf7-t7w7b\" (UID: \"e6266a42-238a-497e-ba45-2994385106f8\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.635167 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e6266a42-238a-497e-ba45-2994385106f8-client-ca\") pod \"controller-manager-5574d8cf7-t7w7b\" (UID: \"e6266a42-238a-497e-ba45-2994385106f8\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.635360 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e6266a42-238a-497e-ba45-2994385106f8-proxy-ca-bundles\") pod \"controller-manager-5574d8cf7-t7w7b\" (UID: \"e6266a42-238a-497e-ba45-2994385106f8\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.635632 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6266a42-238a-497e-ba45-2994385106f8-config\") pod \"controller-manager-5574d8cf7-t7w7b\" (UID: \"e6266a42-238a-497e-ba45-2994385106f8\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.644120 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6266a42-238a-497e-ba45-2994385106f8-serving-cert\") pod \"controller-manager-5574d8cf7-t7w7b\" (UID: \"e6266a42-238a-497e-ba45-2994385106f8\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.650117 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfgmv\" (UniqueName: \"kubernetes.io/projected/e6266a42-238a-497e-ba45-2994385106f8-kube-api-access-cfgmv\") pod \"controller-manager-5574d8cf7-t7w7b\" (UID: \"e6266a42-238a-497e-ba45-2994385106f8\") " pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:34 crc kubenswrapper[4721]: I0202 13:06:34.794922 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:35 crc kubenswrapper[4721]: I0202 13:06:35.256702 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5574d8cf7-t7w7b"] Feb 02 13:06:35 crc kubenswrapper[4721]: W0202 13:06:35.259389 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6266a42_238a_497e_ba45_2994385106f8.slice/crio-590353e2a8b6fe748bdf645c184eaf1162cf3eca687c3a54fb0c9e29035596fd WatchSource:0}: Error finding container 590353e2a8b6fe748bdf645c184eaf1162cf3eca687c3a54fb0c9e29035596fd: Status 404 returned error can't find the container with id 590353e2a8b6fe748bdf645c184eaf1162cf3eca687c3a54fb0c9e29035596fd Feb 02 13:06:36 crc kubenswrapper[4721]: I0202 13:06:36.230107 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" event={"ID":"e6266a42-238a-497e-ba45-2994385106f8","Type":"ContainerStarted","Data":"ddab1107d92641a945d2a1cecc58ed8908eb0412dd2134e2a415a3ab9dac8b54"} Feb 02 13:06:36 crc kubenswrapper[4721]: I0202 13:06:36.230486 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" event={"ID":"e6266a42-238a-497e-ba45-2994385106f8","Type":"ContainerStarted","Data":"590353e2a8b6fe748bdf645c184eaf1162cf3eca687c3a54fb0c9e29035596fd"} Feb 02 13:06:36 crc kubenswrapper[4721]: I0202 13:06:36.230509 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:36 crc kubenswrapper[4721]: I0202 13:06:36.236770 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" Feb 02 13:06:36 crc kubenswrapper[4721]: I0202 13:06:36.248244 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5574d8cf7-t7w7b" podStartSLOduration=4.248222593 podStartE2EDuration="4.248222593s" podCreationTimestamp="2026-02-02 13:06:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:06:36.248174041 +0000 UTC m=+336.550688440" watchObservedRunningTime="2026-02-02 13:06:36.248222593 +0000 UTC m=+336.550736982" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.719938 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-n5k57"] Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.721327 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.732133 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-n5k57"] Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.822601 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgff4\" (UniqueName: \"kubernetes.io/projected/5628b062-be01-4627-aec5-247e0de021e7-kube-api-access-tgff4\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.822675 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5628b062-be01-4627-aec5-247e0de021e7-trusted-ca\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.822719 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5628b062-be01-4627-aec5-247e0de021e7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.822749 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5628b062-be01-4627-aec5-247e0de021e7-bound-sa-token\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.822781 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.822806 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5628b062-be01-4627-aec5-247e0de021e7-registry-certificates\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.822921 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5628b062-be01-4627-aec5-247e0de021e7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.822959 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/5628b062-be01-4627-aec5-247e0de021e7-registry-tls\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.847617 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.924137 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5628b062-be01-4627-aec5-247e0de021e7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.924212 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5628b062-be01-4627-aec5-247e0de021e7-registry-tls\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.924245 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgff4\" (UniqueName: \"kubernetes.io/projected/5628b062-be01-4627-aec5-247e0de021e7-kube-api-access-tgff4\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.924283 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5628b062-be01-4627-aec5-247e0de021e7-trusted-ca\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.924325 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5628b062-be01-4627-aec5-247e0de021e7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.924359 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5628b062-be01-4627-aec5-247e0de021e7-bound-sa-token\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.924382 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5628b062-be01-4627-aec5-247e0de021e7-registry-certificates\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.925033 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5628b062-be01-4627-aec5-247e0de021e7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.925954 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5628b062-be01-4627-aec5-247e0de021e7-trusted-ca\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.926056 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5628b062-be01-4627-aec5-247e0de021e7-registry-certificates\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.929666 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5628b062-be01-4627-aec5-247e0de021e7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.929693 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5628b062-be01-4627-aec5-247e0de021e7-registry-tls\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.943148 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgff4\" (UniqueName: \"kubernetes.io/projected/5628b062-be01-4627-aec5-247e0de021e7-kube-api-access-tgff4\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:47 crc kubenswrapper[4721]: I0202 13:06:47.946955 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5628b062-be01-4627-aec5-247e0de021e7-bound-sa-token\") pod \"image-registry-66df7c8f76-n5k57\" (UID: \"5628b062-be01-4627-aec5-247e0de021e7\") " pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:48 crc kubenswrapper[4721]: I0202 13:06:48.047697 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:48 crc kubenswrapper[4721]: I0202 13:06:48.464361 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-n5k57"] Feb 02 13:06:48 crc kubenswrapper[4721]: W0202 13:06:48.470122 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5628b062_be01_4627_aec5_247e0de021e7.slice/crio-5f212530e3225ba18d9ece1c0311befb7bd5e169d57efacc921a917e4380b790 WatchSource:0}: Error finding container 5f212530e3225ba18d9ece1c0311befb7bd5e169d57efacc921a917e4380b790: Status 404 returned error can't find the container with id 5f212530e3225ba18d9ece1c0311befb7bd5e169d57efacc921a917e4380b790 Feb 02 13:06:49 crc kubenswrapper[4721]: I0202 13:06:49.301705 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" event={"ID":"5628b062-be01-4627-aec5-247e0de021e7","Type":"ContainerStarted","Data":"fac662d7e1da20f0d1cda3f9a98cb1097dd68dfa04e8e668547a49ffebd04c7d"} Feb 02 13:06:49 crc kubenswrapper[4721]: I0202 13:06:49.302039 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:06:49 crc kubenswrapper[4721]: I0202 13:06:49.302052 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" event={"ID":"5628b062-be01-4627-aec5-247e0de021e7","Type":"ContainerStarted","Data":"5f212530e3225ba18d9ece1c0311befb7bd5e169d57efacc921a917e4380b790"} Feb 02 13:06:49 crc kubenswrapper[4721]: I0202 13:06:49.323965 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" podStartSLOduration=2.323945819 podStartE2EDuration="2.323945819s" podCreationTimestamp="2026-02-02 13:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:06:49.318054376 +0000 UTC m=+349.620568765" watchObservedRunningTime="2026-02-02 13:06:49.323945819 +0000 UTC m=+349.626460218" Feb 02 13:07:08 crc kubenswrapper[4721]: I0202 13:07:08.054824 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-n5k57" Feb 02 13:07:08 crc kubenswrapper[4721]: I0202 13:07:08.104630 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wlhhk"] Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.794378 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2tcj"] Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.795260 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s2tcj" podUID="c32d34a1-8dd8-435d-9491-748392c25b97" containerName="registry-server" containerID="cri-o://53d55fa7bab62269a87c795054de164d19a14b295c5f255ec1a3087c58689970" gracePeriod=30 Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.807475 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ftf6s"] Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.807727 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ftf6s" 
podUID="7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" containerName="registry-server" containerID="cri-o://410edca835d21d18be323b803ab840df97552992c0ec9f405104a39dad3828e0" gracePeriod=30 Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.812994 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zcf44"] Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.813289 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" podUID="2c9074bc-889d-4ce7-a250-6fc5984703e0" containerName="marketplace-operator" containerID="cri-o://598f0ede0e5be4e9da5dde9217b5303de54f203ff19e55b638f8757dabfd9366" gracePeriod=30 Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.823770 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-75gx6"] Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.824048 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-75gx6" podUID="b11b9dcc-682e-48c6-9948-78aafcaf9e36" containerName="registry-server" containerID="cri-o://7fb035f08da78b4db85bc1ea0be2acf6c348cb105b55a68b2cac0ebf26b8bc1a" gracePeriod=30 Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.831794 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pm5t7"] Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.839692 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pm5t7" podUID="b97707af-edd5-4907-9459-615b32a005e6" containerName="registry-server" containerID="cri-o://33ab21456de0805136a88c1d6ecd74979aabd0d3bfbd8cc958f6e35c18ed050d" gracePeriod=30 Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.850750 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wdnhz"] Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.852222 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.857835 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wdnhz"] Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.863326 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/884fbbc4-b86d-4f88-9fc6-2aa2015b81d3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wdnhz\" (UID: \"884fbbc4-b86d-4f88-9fc6-2aa2015b81d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.863378 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhzwx\" (UniqueName: \"kubernetes.io/projected/884fbbc4-b86d-4f88-9fc6-2aa2015b81d3-kube-api-access-lhzwx\") pod \"marketplace-operator-79b997595-wdnhz\" (UID: \"884fbbc4-b86d-4f88-9fc6-2aa2015b81d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.863405 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/884fbbc4-b86d-4f88-9fc6-2aa2015b81d3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wdnhz\" (UID: \"884fbbc4-b86d-4f88-9fc6-2aa2015b81d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.965138 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/884fbbc4-b86d-4f88-9fc6-2aa2015b81d3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wdnhz\" (UID: \"884fbbc4-b86d-4f88-9fc6-2aa2015b81d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.965185 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhzwx\" (UniqueName: \"kubernetes.io/projected/884fbbc4-b86d-4f88-9fc6-2aa2015b81d3-kube-api-access-lhzwx\") pod \"marketplace-operator-79b997595-wdnhz\" (UID: \"884fbbc4-b86d-4f88-9fc6-2aa2015b81d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.965208 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/884fbbc4-b86d-4f88-9fc6-2aa2015b81d3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wdnhz\" (UID: \"884fbbc4-b86d-4f88-9fc6-2aa2015b81d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.968760 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/884fbbc4-b86d-4f88-9fc6-2aa2015b81d3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wdnhz\" (UID: \"884fbbc4-b86d-4f88-9fc6-2aa2015b81d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.972844 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/884fbbc4-b86d-4f88-9fc6-2aa2015b81d3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wdnhz\" (UID: \"884fbbc4-b86d-4f88-9fc6-2aa2015b81d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" Feb 02 13:07:10 crc kubenswrapper[4721]: I0202 13:07:10.984213 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhzwx\" (UniqueName: \"kubernetes.io/projected/884fbbc4-b86d-4f88-9fc6-2aa2015b81d3-kube-api-access-lhzwx\") pod \"marketplace-operator-79b997595-wdnhz\" (UID: \"884fbbc4-b86d-4f88-9fc6-2aa2015b81d3\") " pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.252851 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.262792 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.269283 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgmqw\" (UniqueName: \"kubernetes.io/projected/c32d34a1-8dd8-435d-9491-748392c25b97-kube-api-access-bgmqw\") pod \"c32d34a1-8dd8-435d-9491-748392c25b97\" (UID: \"c32d34a1-8dd8-435d-9491-748392c25b97\") " Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.269374 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c32d34a1-8dd8-435d-9491-748392c25b97-utilities\") pod \"c32d34a1-8dd8-435d-9491-748392c25b97\" (UID: \"c32d34a1-8dd8-435d-9491-748392c25b97\") " Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.269456 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c32d34a1-8dd8-435d-9491-748392c25b97-catalog-content\") pod \"c32d34a1-8dd8-435d-9491-748392c25b97\" (UID: \"c32d34a1-8dd8-435d-9491-748392c25b97\") " Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.270994 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c32d34a1-8dd8-435d-9491-748392c25b97-utilities" (OuterVolumeSpecName: "utilities") pod "c32d34a1-8dd8-435d-9491-748392c25b97" (UID: "c32d34a1-8dd8-435d-9491-748392c25b97"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.277324 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c32d34a1-8dd8-435d-9491-748392c25b97-kube-api-access-bgmqw" (OuterVolumeSpecName: "kube-api-access-bgmqw") pod "c32d34a1-8dd8-435d-9491-748392c25b97" (UID: "c32d34a1-8dd8-435d-9491-748392c25b97"). InnerVolumeSpecName "kube-api-access-bgmqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.343829 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c32d34a1-8dd8-435d-9491-748392c25b97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c32d34a1-8dd8-435d-9491-748392c25b97" (UID: "c32d34a1-8dd8-435d-9491-748392c25b97"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.370559 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c32d34a1-8dd8-435d-9491-748392c25b97-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.370602 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgmqw\" (UniqueName: \"kubernetes.io/projected/c32d34a1-8dd8-435d-9491-748392c25b97-kube-api-access-bgmqw\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.370619 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c32d34a1-8dd8-435d-9491-748392c25b97-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.452185 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.457543 4721 generic.go:334] "Generic (PLEG): container finished" podID="b97707af-edd5-4907-9459-615b32a005e6" containerID="33ab21456de0805136a88c1d6ecd74979aabd0d3bfbd8cc958f6e35c18ed050d" exitCode=0 Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.457610 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm5t7" event={"ID":"b97707af-edd5-4907-9459-615b32a005e6","Type":"ContainerDied","Data":"33ab21456de0805136a88c1d6ecd74979aabd0d3bfbd8cc958f6e35c18ed050d"} Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.459994 4721 generic.go:334] "Generic (PLEG): container finished" podID="2c9074bc-889d-4ce7-a250-6fc5984703e0" containerID="598f0ede0e5be4e9da5dde9217b5303de54f203ff19e55b638f8757dabfd9366" exitCode=0 Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.460042 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" event={"ID":"2c9074bc-889d-4ce7-a250-6fc5984703e0","Type":"ContainerDied","Data":"598f0ede0e5be4e9da5dde9217b5303de54f203ff19e55b638f8757dabfd9366"} Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.460173 4721 scope.go:117] "RemoveContainer" containerID="598f0ede0e5be4e9da5dde9217b5303de54f203ff19e55b638f8757dabfd9366" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.460333 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zcf44" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.470350 4721 generic.go:334] "Generic (PLEG): container finished" podID="b11b9dcc-682e-48c6-9948-78aafcaf9e36" containerID="7fb035f08da78b4db85bc1ea0be2acf6c348cb105b55a68b2cac0ebf26b8bc1a" exitCode=0 Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.476377 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75gx6" event={"ID":"b11b9dcc-682e-48c6-9948-78aafcaf9e36","Type":"ContainerDied","Data":"7fb035f08da78b4db85bc1ea0be2acf6c348cb105b55a68b2cac0ebf26b8bc1a"} Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.490766 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.491123 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s2tcj" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.491018 4721 generic.go:334] "Generic (PLEG): container finished" podID="c32d34a1-8dd8-435d-9491-748392c25b97" containerID="53d55fa7bab62269a87c795054de164d19a14b295c5f255ec1a3087c58689970" exitCode=0 Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.491042 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2tcj" event={"ID":"c32d34a1-8dd8-435d-9491-748392c25b97","Type":"ContainerDied","Data":"53d55fa7bab62269a87c795054de164d19a14b295c5f255ec1a3087c58689970"} Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.492415 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s2tcj" event={"ID":"c32d34a1-8dd8-435d-9491-748392c25b97","Type":"ContainerDied","Data":"607b84235315a7323834248f190f48d1f32164d2db07ef1dcc79f0ce6457a6d0"} Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.496309 4721 generic.go:334] "Generic (PLEG): container finished" podID="7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" containerID="410edca835d21d18be323b803ab840df97552992c0ec9f405104a39dad3828e0" exitCode=0 Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.496356 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftf6s" event={"ID":"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b","Type":"ContainerDied","Data":"410edca835d21d18be323b803ab840df97552992c0ec9f405104a39dad3828e0"} Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.496459 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.499192 4721 scope.go:117] "RemoveContainer" containerID="6e95f003df211d09b9562e86431541c3b7c3e84c41d01ea470d07b5cb914180b" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.502487 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.521299 4721 scope.go:117] "RemoveContainer" containerID="53d55fa7bab62269a87c795054de164d19a14b295c5f255ec1a3087c58689970" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.541922 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s2tcj"] Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.542308 4721 scope.go:117] "RemoveContainer" containerID="cb844a088120bc9509ad8ea45c170bf127b70cd4144302a7ad6ac587245cde76" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.558631 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s2tcj"] Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.572923 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c9074bc-889d-4ce7-a250-6fc5984703e0-marketplace-trusted-ca\") pod \"2c9074bc-889d-4ce7-a250-6fc5984703e0\" (UID: \"2c9074bc-889d-4ce7-a250-6fc5984703e0\") " Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.572968 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2c9074bc-889d-4ce7-a250-6fc5984703e0-marketplace-operator-metrics\") pod \"2c9074bc-889d-4ce7-a250-6fc5984703e0\" (UID: \"2c9074bc-889d-4ce7-a250-6fc5984703e0\") " Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.573032 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn92f\" (UniqueName: \"kubernetes.io/projected/2c9074bc-889d-4ce7-a250-6fc5984703e0-kube-api-access-xn92f\") pod \"2c9074bc-889d-4ce7-a250-6fc5984703e0\" (UID: \"2c9074bc-889d-4ce7-a250-6fc5984703e0\") " Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.574290 4721 scope.go:117] "RemoveContainer" containerID="70aefbce2f033e0a1ab3507eb5f25cc0466994c0e699fb1e5db5f4596c72d39d" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.577251 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c9074bc-889d-4ce7-a250-6fc5984703e0-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "2c9074bc-889d-4ce7-a250-6fc5984703e0" (UID: "2c9074bc-889d-4ce7-a250-6fc5984703e0"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.577696 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c9074bc-889d-4ce7-a250-6fc5984703e0-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "2c9074bc-889d-4ce7-a250-6fc5984703e0" (UID: "2c9074bc-889d-4ce7-a250-6fc5984703e0"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.579785 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c9074bc-889d-4ce7-a250-6fc5984703e0-kube-api-access-xn92f" (OuterVolumeSpecName: "kube-api-access-xn92f") pod "2c9074bc-889d-4ce7-a250-6fc5984703e0" (UID: "2c9074bc-889d-4ce7-a250-6fc5984703e0"). InnerVolumeSpecName "kube-api-access-xn92f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.587559 4721 scope.go:117] "RemoveContainer" containerID="53d55fa7bab62269a87c795054de164d19a14b295c5f255ec1a3087c58689970" Feb 02 13:07:11 crc kubenswrapper[4721]: E0202 13:07:11.587858 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53d55fa7bab62269a87c795054de164d19a14b295c5f255ec1a3087c58689970\": container with ID starting with 53d55fa7bab62269a87c795054de164d19a14b295c5f255ec1a3087c58689970 not found: ID does not exist" containerID="53d55fa7bab62269a87c795054de164d19a14b295c5f255ec1a3087c58689970" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.587990 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53d55fa7bab62269a87c795054de164d19a14b295c5f255ec1a3087c58689970"} err="failed to get container status \"53d55fa7bab62269a87c795054de164d19a14b295c5f255ec1a3087c58689970\": rpc error: code = NotFound desc = could not find container \"53d55fa7bab62269a87c795054de164d19a14b295c5f255ec1a3087c58689970\": container with ID starting with 53d55fa7bab62269a87c795054de164d19a14b295c5f255ec1a3087c58689970 not found: ID does not exist" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.588014 4721 scope.go:117] "RemoveContainer" containerID="cb844a088120bc9509ad8ea45c170bf127b70cd4144302a7ad6ac587245cde76" Feb 02 13:07:11 crc kubenswrapper[4721]: E0202 13:07:11.588360 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb844a088120bc9509ad8ea45c170bf127b70cd4144302a7ad6ac587245cde76\": container with ID starting with cb844a088120bc9509ad8ea45c170bf127b70cd4144302a7ad6ac587245cde76 not found: ID does not exist" containerID="cb844a088120bc9509ad8ea45c170bf127b70cd4144302a7ad6ac587245cde76" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.588386 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb844a088120bc9509ad8ea45c170bf127b70cd4144302a7ad6ac587245cde76"} err="failed to get container status \"cb844a088120bc9509ad8ea45c170bf127b70cd4144302a7ad6ac587245cde76\": rpc error: code = NotFound desc = could not find container \"cb844a088120bc9509ad8ea45c170bf127b70cd4144302a7ad6ac587245cde76\": container with ID starting with cb844a088120bc9509ad8ea45c170bf127b70cd4144302a7ad6ac587245cde76 not found: ID does not exist" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.588421 4721 scope.go:117] "RemoveContainer" containerID="70aefbce2f033e0a1ab3507eb5f25cc0466994c0e699fb1e5db5f4596c72d39d" Feb 02 13:07:11 crc kubenswrapper[4721]: E0202 13:07:11.588679 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70aefbce2f033e0a1ab3507eb5f25cc0466994c0e699fb1e5db5f4596c72d39d\": container with ID starting with 70aefbce2f033e0a1ab3507eb5f25cc0466994c0e699fb1e5db5f4596c72d39d not found: ID does not exist" containerID="70aefbce2f033e0a1ab3507eb5f25cc0466994c0e699fb1e5db5f4596c72d39d" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.588732 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70aefbce2f033e0a1ab3507eb5f25cc0466994c0e699fb1e5db5f4596c72d39d"} err="failed to get container status \"70aefbce2f033e0a1ab3507eb5f25cc0466994c0e699fb1e5db5f4596c72d39d\": rpc error: code = NotFound desc = could not 
find container \"70aefbce2f033e0a1ab3507eb5f25cc0466994c0e699fb1e5db5f4596c72d39d\": container with ID starting with 70aefbce2f033e0a1ab3507eb5f25cc0466994c0e699fb1e5db5f4596c72d39d not found: ID does not exist" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.674355 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b97707af-edd5-4907-9459-615b32a005e6-catalog-content\") pod \"b97707af-edd5-4907-9459-615b32a005e6\" (UID: \"b97707af-edd5-4907-9459-615b32a005e6\") " Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.674399 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srrj7\" (UniqueName: \"kubernetes.io/projected/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-kube-api-access-srrj7\") pod \"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b\" (UID: \"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b\") " Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.674457 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsbmd\" (UniqueName: \"kubernetes.io/projected/b11b9dcc-682e-48c6-9948-78aafcaf9e36-kube-api-access-gsbmd\") pod \"b11b9dcc-682e-48c6-9948-78aafcaf9e36\" (UID: \"b11b9dcc-682e-48c6-9948-78aafcaf9e36\") " Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.674484 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-utilities\") pod \"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b\" (UID: \"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b\") " Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.674527 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11b9dcc-682e-48c6-9948-78aafcaf9e36-utilities\") pod \"b11b9dcc-682e-48c6-9948-78aafcaf9e36\" (UID: \"b11b9dcc-682e-48c6-9948-78aafcaf9e36\") " Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.674551 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b97707af-edd5-4907-9459-615b32a005e6-utilities\") pod \"b97707af-edd5-4907-9459-615b32a005e6\" (UID: \"b97707af-edd5-4907-9459-615b32a005e6\") " Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.674573 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-catalog-content\") pod \"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b\" (UID: \"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b\") " Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.674601 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11b9dcc-682e-48c6-9948-78aafcaf9e36-catalog-content\") pod \"b11b9dcc-682e-48c6-9948-78aafcaf9e36\" (UID: \"b11b9dcc-682e-48c6-9948-78aafcaf9e36\") " Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.674621 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk875\" (UniqueName: \"kubernetes.io/projected/b97707af-edd5-4907-9459-615b32a005e6-kube-api-access-mk875\") pod \"b97707af-edd5-4907-9459-615b32a005e6\" (UID: \"b97707af-edd5-4907-9459-615b32a005e6\") " Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.674807 4721 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c9074bc-889d-4ce7-a250-6fc5984703e0-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.674823 4721 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2c9074bc-889d-4ce7-a250-6fc5984703e0-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.674835 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn92f\" (UniqueName: \"kubernetes.io/projected/2c9074bc-889d-4ce7-a250-6fc5984703e0-kube-api-access-xn92f\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.675724 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-utilities" (OuterVolumeSpecName: "utilities") pod "7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" (UID: "7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.676031 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11b9dcc-682e-48c6-9948-78aafcaf9e36-utilities" (OuterVolumeSpecName: "utilities") pod "b11b9dcc-682e-48c6-9948-78aafcaf9e36" (UID: "b11b9dcc-682e-48c6-9948-78aafcaf9e36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.676589 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b97707af-edd5-4907-9459-615b32a005e6-utilities" (OuterVolumeSpecName: "utilities") pod "b97707af-edd5-4907-9459-615b32a005e6" (UID: "b97707af-edd5-4907-9459-615b32a005e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.677168 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11b9dcc-682e-48c6-9948-78aafcaf9e36-kube-api-access-gsbmd" (OuterVolumeSpecName: "kube-api-access-gsbmd") pod "b11b9dcc-682e-48c6-9948-78aafcaf9e36" (UID: "b11b9dcc-682e-48c6-9948-78aafcaf9e36"). InnerVolumeSpecName "kube-api-access-gsbmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.680420 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b97707af-edd5-4907-9459-615b32a005e6-kube-api-access-mk875" (OuterVolumeSpecName: "kube-api-access-mk875") pod "b97707af-edd5-4907-9459-615b32a005e6" (UID: "b97707af-edd5-4907-9459-615b32a005e6"). InnerVolumeSpecName "kube-api-access-mk875". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.686797 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-kube-api-access-srrj7" (OuterVolumeSpecName: "kube-api-access-srrj7") pod "7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" (UID: "7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b"). InnerVolumeSpecName "kube-api-access-srrj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.703712 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11b9dcc-682e-48c6-9948-78aafcaf9e36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11b9dcc-682e-48c6-9948-78aafcaf9e36" (UID: "b11b9dcc-682e-48c6-9948-78aafcaf9e36"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.733908 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" (UID: "7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.779272 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11b9dcc-682e-48c6-9948-78aafcaf9e36-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.779306 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b97707af-edd5-4907-9459-615b32a005e6-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.779320 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.779333 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11b9dcc-682e-48c6-9948-78aafcaf9e36-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.779348 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk875\" (UniqueName: \"kubernetes.io/projected/b97707af-edd5-4907-9459-615b32a005e6-kube-api-access-mk875\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.779362 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srrj7\" (UniqueName: \"kubernetes.io/projected/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-kube-api-access-srrj7\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.779375 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsbmd\" (UniqueName: \"kubernetes.io/projected/b11b9dcc-682e-48c6-9948-78aafcaf9e36-kube-api-access-gsbmd\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.779387 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.800658 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wdnhz"] Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.805735 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zcf44"] Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.811981 4721 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b97707af-edd5-4907-9459-615b32a005e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b97707af-edd5-4907-9459-615b32a005e6" (UID: "b97707af-edd5-4907-9459-615b32a005e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.823262 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zcf44"] Feb 02 13:07:11 crc kubenswrapper[4721]: I0202 13:07:11.880223 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b97707af-edd5-4907-9459-615b32a005e6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.424753 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c9074bc-889d-4ce7-a250-6fc5984703e0" path="/var/lib/kubelet/pods/2c9074bc-889d-4ce7-a250-6fc5984703e0/volumes" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.425749 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c32d34a1-8dd8-435d-9491-748392c25b97" path="/var/lib/kubelet/pods/c32d34a1-8dd8-435d-9491-748392c25b97/volumes" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.503951 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ftf6s" event={"ID":"7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b","Type":"ContainerDied","Data":"ff138d00e2dec0f6fe53dd62f78ed24adffd461fe550704795a81bdea55a7066"} Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.503979 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ftf6s" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.504056 4721 scope.go:117] "RemoveContainer" containerID="410edca835d21d18be323b803ab840df97552992c0ec9f405104a39dad3828e0" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.505594 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pm5t7" event={"ID":"b97707af-edd5-4907-9459-615b32a005e6","Type":"ContainerDied","Data":"731af821f39e85a65313814ab808bf2c6795132d15116e8c0a34a91225b2d5b6"} Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.505707 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pm5t7" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.513446 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75gx6" event={"ID":"b11b9dcc-682e-48c6-9948-78aafcaf9e36","Type":"ContainerDied","Data":"5628d04181a13e0213caa7a951b015bba8003374b2bb6f608199a4eba95c3b17"} Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.513503 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75gx6" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.517954 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" event={"ID":"884fbbc4-b86d-4f88-9fc6-2aa2015b81d3","Type":"ContainerStarted","Data":"d10ae26e9bb0464a19801ffbe62c8af355edb1f673e6e65d0e5f00acc648b10b"} Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.518038 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" event={"ID":"884fbbc4-b86d-4f88-9fc6-2aa2015b81d3","Type":"ContainerStarted","Data":"e5a898e7a9533cf3d6dd33e89749a46f724661ae5084c8e13dd0fdb9f012eca3"} Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.518343 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.523878 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.532095 4721 scope.go:117] "RemoveContainer" containerID="9828d087ebd0a3d61cf032280a26f38ac894879bfffd8ab8dc7b9c9e262b96fd" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.536360 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pm5t7"] Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.538205 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pm5t7"] Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.554589 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ftf6s"] Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.563133 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ftf6s"] Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.564083 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-75gx6"] Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.565827 4721 scope.go:117] "RemoveContainer" containerID="c4262d6d2388653a81a9cb645a74803128568514778cf935a45ab36a4268cbc6" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.566861 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-75gx6"] Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.581857 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wdnhz" podStartSLOduration=2.581840407 podStartE2EDuration="2.581840407s" podCreationTimestamp="2026-02-02 13:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:07:12.579368349 +0000 UTC m=+372.881882758" watchObservedRunningTime="2026-02-02 13:07:12.581840407 +0000 UTC m=+372.884354796" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.589343 4721 scope.go:117] "RemoveContainer" containerID="33ab21456de0805136a88c1d6ecd74979aabd0d3bfbd8cc958f6e35c18ed050d" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.619000 4721 scope.go:117] "RemoveContainer" containerID="0d83a50e35994a069f84ac2fbbbdd424065961585a6b4d7fa1391779a81dfd2b" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.642046 4721 
scope.go:117] "RemoveContainer" containerID="41599e7535f02f311fe8e5965707307ae8f5502aec8ceadc6ba6ac29d4504579" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.660879 4721 scope.go:117] "RemoveContainer" containerID="7fb035f08da78b4db85bc1ea0be2acf6c348cb105b55a68b2cac0ebf26b8bc1a" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.686342 4721 scope.go:117] "RemoveContainer" containerID="e1bc2f4e0704a9ccfc7c0081b3e11299c3f9620d4305892b40a44031af50c66b" Feb 02 13:07:12 crc kubenswrapper[4721]: I0202 13:07:12.705343 4721 scope.go:117] "RemoveContainer" containerID="d98e8c3180feeb272dbc337ede325b3ee8bdf7c11b2445546d5a7351f1d071c3" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019256 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c64xc"] Feb 02 13:07:13 crc kubenswrapper[4721]: E0202 13:07:13.019528 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32d34a1-8dd8-435d-9491-748392c25b97" containerName="extract-utilities" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019540 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32d34a1-8dd8-435d-9491-748392c25b97" containerName="extract-utilities" Feb 02 13:07:13 crc kubenswrapper[4721]: E0202 13:07:13.019550 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" containerName="registry-server" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019556 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" containerName="registry-server" Feb 02 13:07:13 crc kubenswrapper[4721]: E0202 13:07:13.019568 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97707af-edd5-4907-9459-615b32a005e6" containerName="extract-utilities" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019574 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97707af-edd5-4907-9459-615b32a005e6" containerName="extract-utilities" Feb 02 13:07:13 crc kubenswrapper[4721]: E0202 13:07:13.019584 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32d34a1-8dd8-435d-9491-748392c25b97" containerName="registry-server" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019589 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32d34a1-8dd8-435d-9491-748392c25b97" containerName="registry-server" Feb 02 13:07:13 crc kubenswrapper[4721]: E0202 13:07:13.019596 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b11b9dcc-682e-48c6-9948-78aafcaf9e36" containerName="registry-server" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019601 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11b9dcc-682e-48c6-9948-78aafcaf9e36" containerName="registry-server" Feb 02 13:07:13 crc kubenswrapper[4721]: E0202 13:07:13.019614 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b11b9dcc-682e-48c6-9948-78aafcaf9e36" containerName="extract-utilities" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019620 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11b9dcc-682e-48c6-9948-78aafcaf9e36" containerName="extract-utilities" Feb 02 13:07:13 crc kubenswrapper[4721]: E0202 13:07:13.019630 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97707af-edd5-4907-9459-615b32a005e6" containerName="extract-content" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019636 4721 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b97707af-edd5-4907-9459-615b32a005e6" containerName="extract-content" Feb 02 13:07:13 crc kubenswrapper[4721]: E0202 13:07:13.019643 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c9074bc-889d-4ce7-a250-6fc5984703e0" containerName="marketplace-operator" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019649 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c9074bc-889d-4ce7-a250-6fc5984703e0" containerName="marketplace-operator" Feb 02 13:07:13 crc kubenswrapper[4721]: E0202 13:07:13.019660 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" containerName="extract-utilities" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019665 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" containerName="extract-utilities" Feb 02 13:07:13 crc kubenswrapper[4721]: E0202 13:07:13.019673 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c9074bc-889d-4ce7-a250-6fc5984703e0" containerName="marketplace-operator" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019680 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c9074bc-889d-4ce7-a250-6fc5984703e0" containerName="marketplace-operator" Feb 02 13:07:13 crc kubenswrapper[4721]: E0202 13:07:13.019691 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b11b9dcc-682e-48c6-9948-78aafcaf9e36" containerName="extract-content" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019697 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11b9dcc-682e-48c6-9948-78aafcaf9e36" containerName="extract-content" Feb 02 13:07:13 crc kubenswrapper[4721]: E0202 13:07:13.019704 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" containerName="extract-content" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019709 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" containerName="extract-content" Feb 02 13:07:13 crc kubenswrapper[4721]: E0202 13:07:13.019718 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32d34a1-8dd8-435d-9491-748392c25b97" containerName="extract-content" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019723 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32d34a1-8dd8-435d-9491-748392c25b97" containerName="extract-content" Feb 02 13:07:13 crc kubenswrapper[4721]: E0202 13:07:13.019732 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97707af-edd5-4907-9459-615b32a005e6" containerName="registry-server" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019737 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97707af-edd5-4907-9459-615b32a005e6" containerName="registry-server" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019821 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b11b9dcc-682e-48c6-9948-78aafcaf9e36" containerName="registry-server" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019829 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" containerName="registry-server" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019841 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32d34a1-8dd8-435d-9491-748392c25b97" containerName="registry-server" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019847 
4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c9074bc-889d-4ce7-a250-6fc5984703e0" containerName="marketplace-operator" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019857 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c9074bc-889d-4ce7-a250-6fc5984703e0" containerName="marketplace-operator" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.019863 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b97707af-edd5-4907-9459-615b32a005e6" containerName="registry-server" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.020590 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c64xc" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.022333 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.023277 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c64xc"] Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.209169 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8drnj\" (UniqueName: \"kubernetes.io/projected/8851d4c5-8c20-440c-bb07-d7542ea1620d-kube-api-access-8drnj\") pod \"redhat-marketplace-c64xc\" (UID: \"8851d4c5-8c20-440c-bb07-d7542ea1620d\") " pod="openshift-marketplace/redhat-marketplace-c64xc" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.209278 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8851d4c5-8c20-440c-bb07-d7542ea1620d-utilities\") pod \"redhat-marketplace-c64xc\" (UID: \"8851d4c5-8c20-440c-bb07-d7542ea1620d\") " pod="openshift-marketplace/redhat-marketplace-c64xc" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.209313 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8851d4c5-8c20-440c-bb07-d7542ea1620d-catalog-content\") pod \"redhat-marketplace-c64xc\" (UID: \"8851d4c5-8c20-440c-bb07-d7542ea1620d\") " pod="openshift-marketplace/redhat-marketplace-c64xc" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.213173 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gc4db"] Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.215054 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.219525 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.221538 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gc4db"] Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.310359 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8851d4c5-8c20-440c-bb07-d7542ea1620d-utilities\") pod \"redhat-marketplace-c64xc\" (UID: \"8851d4c5-8c20-440c-bb07-d7542ea1620d\") " pod="openshift-marketplace/redhat-marketplace-c64xc" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.310414 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8851d4c5-8c20-440c-bb07-d7542ea1620d-catalog-content\") pod \"redhat-marketplace-c64xc\" (UID: \"8851d4c5-8c20-440c-bb07-d7542ea1620d\") " pod="openshift-marketplace/redhat-marketplace-c64xc" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.310734 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8drnj\" (UniqueName: \"kubernetes.io/projected/8851d4c5-8c20-440c-bb07-d7542ea1620d-kube-api-access-8drnj\") pod \"redhat-marketplace-c64xc\" (UID: \"8851d4c5-8c20-440c-bb07-d7542ea1620d\") " pod="openshift-marketplace/redhat-marketplace-c64xc" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.310853 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8851d4c5-8c20-440c-bb07-d7542ea1620d-catalog-content\") pod \"redhat-marketplace-c64xc\" (UID: \"8851d4c5-8c20-440c-bb07-d7542ea1620d\") " pod="openshift-marketplace/redhat-marketplace-c64xc" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.311401 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8851d4c5-8c20-440c-bb07-d7542ea1620d-utilities\") pod \"redhat-marketplace-c64xc\" (UID: \"8851d4c5-8c20-440c-bb07-d7542ea1620d\") " pod="openshift-marketplace/redhat-marketplace-c64xc" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.338835 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8drnj\" (UniqueName: \"kubernetes.io/projected/8851d4c5-8c20-440c-bb07-d7542ea1620d-kube-api-access-8drnj\") pod \"redhat-marketplace-c64xc\" (UID: \"8851d4c5-8c20-440c-bb07-d7542ea1620d\") " pod="openshift-marketplace/redhat-marketplace-c64xc" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.344301 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c64xc" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.412274 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-utilities\") pod \"redhat-operators-gc4db\" (UID: \"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1\") " pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.412563 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-catalog-content\") pod \"redhat-operators-gc4db\" (UID: \"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1\") " pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.412679 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9grc\" (UniqueName: \"kubernetes.io/projected/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-kube-api-access-f9grc\") pod \"redhat-operators-gc4db\" (UID: \"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1\") " pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.513461 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-utilities\") pod \"redhat-operators-gc4db\" (UID: \"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1\") " pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.513528 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-catalog-content\") pod \"redhat-operators-gc4db\" (UID: \"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1\") " pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.513591 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9grc\" (UniqueName: \"kubernetes.io/projected/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-kube-api-access-f9grc\") pod \"redhat-operators-gc4db\" (UID: \"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1\") " pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.513987 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-utilities\") pod \"redhat-operators-gc4db\" (UID: \"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1\") " pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.514243 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-catalog-content\") pod \"redhat-operators-gc4db\" (UID: \"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1\") " pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.537205 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9grc\" (UniqueName: \"kubernetes.io/projected/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-kube-api-access-f9grc\") pod \"redhat-operators-gc4db\" (UID: 
\"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1\") " pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.547489 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.754497 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c64xc"] Feb 02 13:07:13 crc kubenswrapper[4721]: W0202 13:07:13.766497 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8851d4c5_8c20_440c_bb07_d7542ea1620d.slice/crio-96be71087000d8a85cbf561c42377e0d2351a76a62b024a0cafc13962b0aa12f WatchSource:0}: Error finding container 96be71087000d8a85cbf561c42377e0d2351a76a62b024a0cafc13962b0aa12f: Status 404 returned error can't find the container with id 96be71087000d8a85cbf561c42377e0d2351a76a62b024a0cafc13962b0aa12f Feb 02 13:07:13 crc kubenswrapper[4721]: I0202 13:07:13.954933 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gc4db"] Feb 02 13:07:14 crc kubenswrapper[4721]: W0202 13:07:14.023857 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d4d1a7c_52fd_456d_ab0e_78a9c4529fd1.slice/crio-c443084e3e1b254cd6cca1fcfdbe64c90be36a3dfa2b67b63b03d3820015e610 WatchSource:0}: Error finding container c443084e3e1b254cd6cca1fcfdbe64c90be36a3dfa2b67b63b03d3820015e610: Status 404 returned error can't find the container with id c443084e3e1b254cd6cca1fcfdbe64c90be36a3dfa2b67b63b03d3820015e610 Feb 02 13:07:14 crc kubenswrapper[4721]: I0202 13:07:14.417639 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b" path="/var/lib/kubelet/pods/7d9a2b55-8601-4fdc-9e0a-1cfe62d3754b/volumes" Feb 02 13:07:14 crc kubenswrapper[4721]: I0202 13:07:14.419050 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11b9dcc-682e-48c6-9948-78aafcaf9e36" path="/var/lib/kubelet/pods/b11b9dcc-682e-48c6-9948-78aafcaf9e36/volumes" Feb 02 13:07:14 crc kubenswrapper[4721]: I0202 13:07:14.420141 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b97707af-edd5-4907-9459-615b32a005e6" path="/var/lib/kubelet/pods/b97707af-edd5-4907-9459-615b32a005e6/volumes" Feb 02 13:07:14 crc kubenswrapper[4721]: I0202 13:07:14.534856 4721 generic.go:334] "Generic (PLEG): container finished" podID="8851d4c5-8c20-440c-bb07-d7542ea1620d" containerID="589fb384ac7240dcc9c807d4c1ce3907384769a460707e2d0ea0f1f488fe20b1" exitCode=0 Feb 02 13:07:14 crc kubenswrapper[4721]: I0202 13:07:14.534903 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c64xc" event={"ID":"8851d4c5-8c20-440c-bb07-d7542ea1620d","Type":"ContainerDied","Data":"589fb384ac7240dcc9c807d4c1ce3907384769a460707e2d0ea0f1f488fe20b1"} Feb 02 13:07:14 crc kubenswrapper[4721]: I0202 13:07:14.534925 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c64xc" event={"ID":"8851d4c5-8c20-440c-bb07-d7542ea1620d","Type":"ContainerStarted","Data":"96be71087000d8a85cbf561c42377e0d2351a76a62b024a0cafc13962b0aa12f"} Feb 02 13:07:14 crc kubenswrapper[4721]: I0202 13:07:14.538532 4721 generic.go:334] "Generic (PLEG): container finished" podID="3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" 
containerID="b146ac5d525d99b3d9e62ac67dc0abf39c407bd8cadfc21eb11915cc07946f3f" exitCode=0 Feb 02 13:07:14 crc kubenswrapper[4721]: I0202 13:07:14.538962 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gc4db" event={"ID":"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1","Type":"ContainerDied","Data":"b146ac5d525d99b3d9e62ac67dc0abf39c407bd8cadfc21eb11915cc07946f3f"} Feb 02 13:07:14 crc kubenswrapper[4721]: I0202 13:07:14.539034 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gc4db" event={"ID":"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1","Type":"ContainerStarted","Data":"c443084e3e1b254cd6cca1fcfdbe64c90be36a3dfa2b67b63b03d3820015e610"} Feb 02 13:07:14 crc kubenswrapper[4721]: I0202 13:07:14.764083 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:07:14 crc kubenswrapper[4721]: I0202 13:07:14.764153 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.406920 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w5wlg"] Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.408398 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.417473 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.421403 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5wlg"] Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.444463 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db39b59-16bf-4029-b8be-4be395b09cdf-utilities\") pod \"certified-operators-w5wlg\" (UID: \"2db39b59-16bf-4029-b8be-4be395b09cdf\") " pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.444527 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9j2g\" (UniqueName: \"kubernetes.io/projected/2db39b59-16bf-4029-b8be-4be395b09cdf-kube-api-access-c9j2g\") pod \"certified-operators-w5wlg\" (UID: \"2db39b59-16bf-4029-b8be-4be395b09cdf\") " pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.444568 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db39b59-16bf-4029-b8be-4be395b09cdf-catalog-content\") pod \"certified-operators-w5wlg\" (UID: \"2db39b59-16bf-4029-b8be-4be395b09cdf\") " pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.546737 4721 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db39b59-16bf-4029-b8be-4be395b09cdf-catalog-content\") pod \"certified-operators-w5wlg\" (UID: \"2db39b59-16bf-4029-b8be-4be395b09cdf\") " pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.547162 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db39b59-16bf-4029-b8be-4be395b09cdf-utilities\") pod \"certified-operators-w5wlg\" (UID: \"2db39b59-16bf-4029-b8be-4be395b09cdf\") " pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.547206 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9j2g\" (UniqueName: \"kubernetes.io/projected/2db39b59-16bf-4029-b8be-4be395b09cdf-kube-api-access-c9j2g\") pod \"certified-operators-w5wlg\" (UID: \"2db39b59-16bf-4029-b8be-4be395b09cdf\") " pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.547307 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db39b59-16bf-4029-b8be-4be395b09cdf-catalog-content\") pod \"certified-operators-w5wlg\" (UID: \"2db39b59-16bf-4029-b8be-4be395b09cdf\") " pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.547522 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db39b59-16bf-4029-b8be-4be395b09cdf-utilities\") pod \"certified-operators-w5wlg\" (UID: \"2db39b59-16bf-4029-b8be-4be395b09cdf\") " pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.577402 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9j2g\" (UniqueName: \"kubernetes.io/projected/2db39b59-16bf-4029-b8be-4be395b09cdf-kube-api-access-c9j2g\") pod \"certified-operators-w5wlg\" (UID: \"2db39b59-16bf-4029-b8be-4be395b09cdf\") " pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.619143 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kv46m"] Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.621090 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kv46m" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.626085 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kv46m"] Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.626666 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.648218 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9ad0b8-eef7-451f-82b9-1b5cc54c63c2-catalog-content\") pod \"community-operators-kv46m\" (UID: \"be9ad0b8-eef7-451f-82b9-1b5cc54c63c2\") " pod="openshift-marketplace/community-operators-kv46m" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.648282 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5svst\" (UniqueName: \"kubernetes.io/projected/be9ad0b8-eef7-451f-82b9-1b5cc54c63c2-kube-api-access-5svst\") pod \"community-operators-kv46m\" (UID: \"be9ad0b8-eef7-451f-82b9-1b5cc54c63c2\") " pod="openshift-marketplace/community-operators-kv46m" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.648316 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9ad0b8-eef7-451f-82b9-1b5cc54c63c2-utilities\") pod \"community-operators-kv46m\" (UID: \"be9ad0b8-eef7-451f-82b9-1b5cc54c63c2\") " pod="openshift-marketplace/community-operators-kv46m" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.736196 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.749590 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5svst\" (UniqueName: \"kubernetes.io/projected/be9ad0b8-eef7-451f-82b9-1b5cc54c63c2-kube-api-access-5svst\") pod \"community-operators-kv46m\" (UID: \"be9ad0b8-eef7-451f-82b9-1b5cc54c63c2\") " pod="openshift-marketplace/community-operators-kv46m" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.749662 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9ad0b8-eef7-451f-82b9-1b5cc54c63c2-utilities\") pod \"community-operators-kv46m\" (UID: \"be9ad0b8-eef7-451f-82b9-1b5cc54c63c2\") " pod="openshift-marketplace/community-operators-kv46m" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.749726 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9ad0b8-eef7-451f-82b9-1b5cc54c63c2-catalog-content\") pod \"community-operators-kv46m\" (UID: \"be9ad0b8-eef7-451f-82b9-1b5cc54c63c2\") " pod="openshift-marketplace/community-operators-kv46m" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.750316 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9ad0b8-eef7-451f-82b9-1b5cc54c63c2-catalog-content\") pod \"community-operators-kv46m\" (UID: \"be9ad0b8-eef7-451f-82b9-1b5cc54c63c2\") " pod="openshift-marketplace/community-operators-kv46m" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.750427 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9ad0b8-eef7-451f-82b9-1b5cc54c63c2-utilities\") pod \"community-operators-kv46m\" (UID: \"be9ad0b8-eef7-451f-82b9-1b5cc54c63c2\") " pod="openshift-marketplace/community-operators-kv46m" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.769804 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5svst\" (UniqueName: \"kubernetes.io/projected/be9ad0b8-eef7-451f-82b9-1b5cc54c63c2-kube-api-access-5svst\") pod \"community-operators-kv46m\" (UID: \"be9ad0b8-eef7-451f-82b9-1b5cc54c63c2\") " pod="openshift-marketplace/community-operators-kv46m" Feb 02 13:07:15 crc kubenswrapper[4721]: I0202 13:07:15.966804 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kv46m" Feb 02 13:07:16 crc kubenswrapper[4721]: I0202 13:07:16.420909 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kv46m"] Feb 02 13:07:16 crc kubenswrapper[4721]: I0202 13:07:16.471582 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5wlg"] Feb 02 13:07:16 crc kubenswrapper[4721]: W0202 13:07:16.481612 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2db39b59_16bf_4029_b8be_4be395b09cdf.slice/crio-aa53f668db48e2aa112a5cbfcc3d2601a92bffa612a49b3105e8618823c15e6b WatchSource:0}: Error finding container aa53f668db48e2aa112a5cbfcc3d2601a92bffa612a49b3105e8618823c15e6b: Status 404 returned error can't find the container with id aa53f668db48e2aa112a5cbfcc3d2601a92bffa612a49b3105e8618823c15e6b Feb 02 13:07:16 crc kubenswrapper[4721]: I0202 13:07:16.551009 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kv46m" event={"ID":"be9ad0b8-eef7-451f-82b9-1b5cc54c63c2","Type":"ContainerStarted","Data":"13b55abfcd9ca12983bfa4cac3819ce020504592eb47d09b42cdb49038a429be"} Feb 02 13:07:16 crc kubenswrapper[4721]: I0202 13:07:16.552242 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5wlg" event={"ID":"2db39b59-16bf-4029-b8be-4be395b09cdf","Type":"ContainerStarted","Data":"aa53f668db48e2aa112a5cbfcc3d2601a92bffa612a49b3105e8618823c15e6b"} Feb 02 13:07:16 crc kubenswrapper[4721]: I0202 13:07:16.555641 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gc4db" event={"ID":"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1","Type":"ContainerStarted","Data":"cd748e7fb6f12d62b84d0d6332d25370d2d67d418388354bb68e32e6ced809d6"} Feb 02 13:07:16 crc kubenswrapper[4721]: I0202 13:07:16.559231 4721 generic.go:334] "Generic (PLEG): container finished" podID="8851d4c5-8c20-440c-bb07-d7542ea1620d" containerID="b921fdc1a1548d0c5260ec96ba224ac1d42c622bf506f78cd51ecd903122aa38" exitCode=0 Feb 02 13:07:16 crc kubenswrapper[4721]: I0202 13:07:16.559271 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c64xc" event={"ID":"8851d4c5-8c20-440c-bb07-d7542ea1620d","Type":"ContainerDied","Data":"b921fdc1a1548d0c5260ec96ba224ac1d42c622bf506f78cd51ecd903122aa38"} Feb 02 13:07:17 crc kubenswrapper[4721]: I0202 13:07:17.566676 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c64xc" event={"ID":"8851d4c5-8c20-440c-bb07-d7542ea1620d","Type":"ContainerStarted","Data":"b883177debaaa82956b438155412f9e795a9ef3cca98eeb4a50b4e9a65b484f7"} Feb 02 13:07:17 crc kubenswrapper[4721]: I0202 13:07:17.569849 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kv46m" event={"ID":"be9ad0b8-eef7-451f-82b9-1b5cc54c63c2","Type":"ContainerDied","Data":"27d629500b0dd1abf484f3a89bad9c432c4ba1d67d2523e09a86d87aff25f1c0"} Feb 02 13:07:17 crc kubenswrapper[4721]: I0202 13:07:17.569707 4721 generic.go:334] "Generic (PLEG): container finished" podID="be9ad0b8-eef7-451f-82b9-1b5cc54c63c2" containerID="27d629500b0dd1abf484f3a89bad9c432c4ba1d67d2523e09a86d87aff25f1c0" exitCode=0 Feb 02 13:07:17 crc kubenswrapper[4721]: I0202 13:07:17.573967 4721 generic.go:334] "Generic (PLEG): container finished" 
podID="2db39b59-16bf-4029-b8be-4be395b09cdf" containerID="cda228ff8b653e386af95f32890f0fe9189a306544ff456cd8d4ee6c401233dc" exitCode=0 Feb 02 13:07:17 crc kubenswrapper[4721]: I0202 13:07:17.574023 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5wlg" event={"ID":"2db39b59-16bf-4029-b8be-4be395b09cdf","Type":"ContainerDied","Data":"cda228ff8b653e386af95f32890f0fe9189a306544ff456cd8d4ee6c401233dc"} Feb 02 13:07:17 crc kubenswrapper[4721]: I0202 13:07:17.576876 4721 generic.go:334] "Generic (PLEG): container finished" podID="3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" containerID="cd748e7fb6f12d62b84d0d6332d25370d2d67d418388354bb68e32e6ced809d6" exitCode=0 Feb 02 13:07:17 crc kubenswrapper[4721]: I0202 13:07:17.576898 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gc4db" event={"ID":"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1","Type":"ContainerDied","Data":"cd748e7fb6f12d62b84d0d6332d25370d2d67d418388354bb68e32e6ced809d6"} Feb 02 13:07:17 crc kubenswrapper[4721]: I0202 13:07:17.614739 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c64xc" podStartSLOduration=2.959321091 podStartE2EDuration="5.614715549s" podCreationTimestamp="2026-02-02 13:07:12 +0000 UTC" firstStartedPulling="2026-02-02 13:07:14.536426484 +0000 UTC m=+374.838940873" lastFinishedPulling="2026-02-02 13:07:17.191820932 +0000 UTC m=+377.494335331" observedRunningTime="2026-02-02 13:07:17.593337556 +0000 UTC m=+377.895851965" watchObservedRunningTime="2026-02-02 13:07:17.614715549 +0000 UTC m=+377.917229948" Feb 02 13:07:19 crc kubenswrapper[4721]: I0202 13:07:19.589160 4721 generic.go:334] "Generic (PLEG): container finished" podID="be9ad0b8-eef7-451f-82b9-1b5cc54c63c2" containerID="b13058d7a73183fa9d3319bc03fdee0b0f77b3a7ccd547efe16bc18ab7a6b684" exitCode=0 Feb 02 13:07:19 crc kubenswrapper[4721]: I0202 13:07:19.589218 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kv46m" event={"ID":"be9ad0b8-eef7-451f-82b9-1b5cc54c63c2","Type":"ContainerDied","Data":"b13058d7a73183fa9d3319bc03fdee0b0f77b3a7ccd547efe16bc18ab7a6b684"} Feb 02 13:07:19 crc kubenswrapper[4721]: I0202 13:07:19.594184 4721 generic.go:334] "Generic (PLEG): container finished" podID="2db39b59-16bf-4029-b8be-4be395b09cdf" containerID="6a9b786e3b23e3aaade71c1314045316e2616b985dc63d18d872102a5103829f" exitCode=0 Feb 02 13:07:19 crc kubenswrapper[4721]: I0202 13:07:19.594331 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5wlg" event={"ID":"2db39b59-16bf-4029-b8be-4be395b09cdf","Type":"ContainerDied","Data":"6a9b786e3b23e3aaade71c1314045316e2616b985dc63d18d872102a5103829f"} Feb 02 13:07:19 crc kubenswrapper[4721]: I0202 13:07:19.600137 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gc4db" event={"ID":"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1","Type":"ContainerStarted","Data":"f83c45e1ff378ffe8b7f795fb252604530ddd6b5c22fbafb2acc28ce75d5835a"} Feb 02 13:07:19 crc kubenswrapper[4721]: I0202 13:07:19.636168 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gc4db" podStartSLOduration=2.688735166 podStartE2EDuration="6.636152451s" podCreationTimestamp="2026-02-02 13:07:13 +0000 UTC" firstStartedPulling="2026-02-02 13:07:14.539830819 +0000 UTC m=+374.842345208" lastFinishedPulling="2026-02-02 
13:07:18.487248104 +0000 UTC m=+378.789762493" observedRunningTime="2026-02-02 13:07:19.63503319 +0000 UTC m=+379.937547589" watchObservedRunningTime="2026-02-02 13:07:19.636152451 +0000 UTC m=+379.938666840" Feb 02 13:07:20 crc kubenswrapper[4721]: I0202 13:07:20.608199 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kv46m" event={"ID":"be9ad0b8-eef7-451f-82b9-1b5cc54c63c2","Type":"ContainerStarted","Data":"aee779ee53582607f4f45862acaef80e8e030a57d723aa45959d7fc3dc54b957"} Feb 02 13:07:20 crc kubenswrapper[4721]: I0202 13:07:20.618889 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5wlg" event={"ID":"2db39b59-16bf-4029-b8be-4be395b09cdf","Type":"ContainerStarted","Data":"fac2c6657c2c4b2c8f188a592777004421506c7d4a56916a7a265847528a9b87"} Feb 02 13:07:20 crc kubenswrapper[4721]: I0202 13:07:20.642335 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kv46m" podStartSLOduration=3.254529645 podStartE2EDuration="5.642318255s" podCreationTimestamp="2026-02-02 13:07:15 +0000 UTC" firstStartedPulling="2026-02-02 13:07:17.570988745 +0000 UTC m=+377.873503134" lastFinishedPulling="2026-02-02 13:07:19.958777365 +0000 UTC m=+380.261291744" observedRunningTime="2026-02-02 13:07:20.636314779 +0000 UTC m=+380.938829168" watchObservedRunningTime="2026-02-02 13:07:20.642318255 +0000 UTC m=+380.944832664" Feb 02 13:07:20 crc kubenswrapper[4721]: I0202 13:07:20.657620 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w5wlg" podStartSLOduration=3.216043257 podStartE2EDuration="5.657583309s" podCreationTimestamp="2026-02-02 13:07:15 +0000 UTC" firstStartedPulling="2026-02-02 13:07:17.575914631 +0000 UTC m=+377.878429020" lastFinishedPulling="2026-02-02 13:07:20.017454683 +0000 UTC m=+380.319969072" observedRunningTime="2026-02-02 13:07:20.652129808 +0000 UTC m=+380.954644207" watchObservedRunningTime="2026-02-02 13:07:20.657583309 +0000 UTC m=+380.960097708" Feb 02 13:07:23 crc kubenswrapper[4721]: I0202 13:07:23.344874 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c64xc" Feb 02 13:07:23 crc kubenswrapper[4721]: I0202 13:07:23.345255 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c64xc" Feb 02 13:07:23 crc kubenswrapper[4721]: I0202 13:07:23.385978 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c64xc" Feb 02 13:07:23 crc kubenswrapper[4721]: I0202 13:07:23.548361 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:07:23 crc kubenswrapper[4721]: I0202 13:07:23.548428 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:07:23 crc kubenswrapper[4721]: I0202 13:07:23.704801 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c64xc" Feb 02 13:07:24 crc kubenswrapper[4721]: I0202 13:07:24.608192 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gc4db" podUID="3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" containerName="registry-server" probeResult="failure" output=< Feb 02 13:07:24 crc 
kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:07:24 crc kubenswrapper[4721]: > Feb 02 13:07:25 crc kubenswrapper[4721]: I0202 13:07:25.737755 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:07:25 crc kubenswrapper[4721]: I0202 13:07:25.738596 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:07:25 crc kubenswrapper[4721]: I0202 13:07:25.785375 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:07:25 crc kubenswrapper[4721]: I0202 13:07:25.967718 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kv46m" Feb 02 13:07:25 crc kubenswrapper[4721]: I0202 13:07:25.969099 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kv46m" Feb 02 13:07:26 crc kubenswrapper[4721]: I0202 13:07:26.005444 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kv46m" Feb 02 13:07:26 crc kubenswrapper[4721]: I0202 13:07:26.682805 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:07:26 crc kubenswrapper[4721]: I0202 13:07:26.683418 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kv46m" Feb 02 13:07:33 crc kubenswrapper[4721]: I0202 13:07:33.144305 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" podUID="e5f7f80a-15ef-47b9-9e1e-325066df7897" containerName="registry" containerID="cri-o://0bd3bc5f864672ee0d853f714bcf1118de69c8c71858f5b65ce239a92ed34811" gracePeriod=30 Feb 02 13:07:33 crc kubenswrapper[4721]: I0202 13:07:33.590019 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:07:33 crc kubenswrapper[4721]: I0202 13:07:33.629253 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:07:33 crc kubenswrapper[4721]: I0202 13:07:33.711033 4721 generic.go:334] "Generic (PLEG): container finished" podID="e5f7f80a-15ef-47b9-9e1e-325066df7897" containerID="0bd3bc5f864672ee0d853f714bcf1118de69c8c71858f5b65ce239a92ed34811" exitCode=0 Feb 02 13:07:33 crc kubenswrapper[4721]: I0202 13:07:33.711114 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" event={"ID":"e5f7f80a-15ef-47b9-9e1e-325066df7897","Type":"ContainerDied","Data":"0bd3bc5f864672ee0d853f714bcf1118de69c8c71858f5b65ce239a92ed34811"} Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.071978 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.210470 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5f7f80a-15ef-47b9-9e1e-325066df7897-ca-trust-extracted\") pod \"e5f7f80a-15ef-47b9-9e1e-325066df7897\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.210578 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5f7f80a-15ef-47b9-9e1e-325066df7897-trusted-ca\") pod \"e5f7f80a-15ef-47b9-9e1e-325066df7897\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.210614 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwm4n\" (UniqueName: \"kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-kube-api-access-rwm4n\") pod \"e5f7f80a-15ef-47b9-9e1e-325066df7897\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.210650 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5f7f80a-15ef-47b9-9e1e-325066df7897-installation-pull-secrets\") pod \"e5f7f80a-15ef-47b9-9e1e-325066df7897\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.210704 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5f7f80a-15ef-47b9-9e1e-325066df7897-registry-certificates\") pod \"e5f7f80a-15ef-47b9-9e1e-325066df7897\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.210861 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e5f7f80a-15ef-47b9-9e1e-325066df7897\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.210889 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-registry-tls\") pod \"e5f7f80a-15ef-47b9-9e1e-325066df7897\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.210931 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-bound-sa-token\") pod \"e5f7f80a-15ef-47b9-9e1e-325066df7897\" (UID: \"e5f7f80a-15ef-47b9-9e1e-325066df7897\") " Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.211766 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f7f80a-15ef-47b9-9e1e-325066df7897-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e5f7f80a-15ef-47b9-9e1e-325066df7897" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.211789 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f7f80a-15ef-47b9-9e1e-325066df7897-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e5f7f80a-15ef-47b9-9e1e-325066df7897" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.212614 4721 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e5f7f80a-15ef-47b9-9e1e-325066df7897-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.212653 4721 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5f7f80a-15ef-47b9-9e1e-325066df7897-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.217191 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-kube-api-access-rwm4n" (OuterVolumeSpecName: "kube-api-access-rwm4n") pod "e5f7f80a-15ef-47b9-9e1e-325066df7897" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897"). InnerVolumeSpecName "kube-api-access-rwm4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.217991 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e5f7f80a-15ef-47b9-9e1e-325066df7897" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.217513 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f7f80a-15ef-47b9-9e1e-325066df7897-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e5f7f80a-15ef-47b9-9e1e-325066df7897" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.219305 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e5f7f80a-15ef-47b9-9e1e-325066df7897" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.236053 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5f7f80a-15ef-47b9-9e1e-325066df7897-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e5f7f80a-15ef-47b9-9e1e-325066df7897" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.237949 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e5f7f80a-15ef-47b9-9e1e-325066df7897" (UID: "e5f7f80a-15ef-47b9-9e1e-325066df7897"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.313596 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwm4n\" (UniqueName: \"kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-kube-api-access-rwm4n\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.313638 4721 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e5f7f80a-15ef-47b9-9e1e-325066df7897-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.313652 4721 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.313664 4721 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5f7f80a-15ef-47b9-9e1e-325066df7897-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.313674 4721 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e5f7f80a-15ef-47b9-9e1e-325066df7897-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.717609 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" event={"ID":"e5f7f80a-15ef-47b9-9e1e-325066df7897","Type":"ContainerDied","Data":"05e1b0050534ad29187ccb842c7d704d41289dc2c02dfe3c8fae4b1bff20a647"} Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.718061 4721 scope.go:117] "RemoveContainer" containerID="0bd3bc5f864672ee0d853f714bcf1118de69c8c71858f5b65ce239a92ed34811" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.717891 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-wlhhk" Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.734040 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wlhhk"] Feb 02 13:07:34 crc kubenswrapper[4721]: I0202 13:07:34.738521 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-wlhhk"] Feb 02 13:07:36 crc kubenswrapper[4721]: I0202 13:07:36.416477 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f7f80a-15ef-47b9-9e1e-325066df7897" path="/var/lib/kubelet/pods/e5f7f80a-15ef-47b9-9e1e-325066df7897/volumes" Feb 02 13:07:42 crc kubenswrapper[4721]: I0202 13:07:42.957227 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg"] Feb 02 13:07:42 crc kubenswrapper[4721]: E0202 13:07:42.957462 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f7f80a-15ef-47b9-9e1e-325066df7897" containerName="registry" Feb 02 13:07:42 crc kubenswrapper[4721]: I0202 13:07:42.957473 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f7f80a-15ef-47b9-9e1e-325066df7897" containerName="registry" Feb 02 13:07:42 crc kubenswrapper[4721]: I0202 13:07:42.957558 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f7f80a-15ef-47b9-9e1e-325066df7897" containerName="registry" Feb 02 13:07:42 crc kubenswrapper[4721]: I0202 13:07:42.957898 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg" Feb 02 13:07:42 crc kubenswrapper[4721]: I0202 13:07:42.961051 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Feb 02 13:07:42 crc kubenswrapper[4721]: I0202 13:07:42.961207 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Feb 02 13:07:42 crc kubenswrapper[4721]: I0202 13:07:42.961259 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Feb 02 13:07:42 crc kubenswrapper[4721]: I0202 13:07:42.964997 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Feb 02 13:07:42 crc kubenswrapper[4721]: I0202 13:07:42.965425 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Feb 02 13:07:42 crc kubenswrapper[4721]: I0202 13:07:42.971478 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg"] Feb 02 13:07:43 crc kubenswrapper[4721]: I0202 13:07:43.130472 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-298jq\" (UniqueName: \"kubernetes.io/projected/5de3bc6e-7b95-472f-9f28-84414fa8e54f-kube-api-access-298jq\") pod \"cluster-monitoring-operator-6d5b84845-z4lcg\" (UID: \"5de3bc6e-7b95-472f-9f28-84414fa8e54f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg" Feb 02 13:07:43 crc kubenswrapper[4721]: I0202 13:07:43.130515 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/5de3bc6e-7b95-472f-9f28-84414fa8e54f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-z4lcg\" (UID: \"5de3bc6e-7b95-472f-9f28-84414fa8e54f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg" Feb 02 13:07:43 crc kubenswrapper[4721]: I0202 13:07:43.130571 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5de3bc6e-7b95-472f-9f28-84414fa8e54f-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-z4lcg\" (UID: \"5de3bc6e-7b95-472f-9f28-84414fa8e54f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg" Feb 02 13:07:43 crc kubenswrapper[4721]: I0202 13:07:43.231485 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5de3bc6e-7b95-472f-9f28-84414fa8e54f-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-z4lcg\" (UID: \"5de3bc6e-7b95-472f-9f28-84414fa8e54f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg" Feb 02 13:07:43 crc kubenswrapper[4721]: I0202 13:07:43.231557 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-298jq\" (UniqueName: \"kubernetes.io/projected/5de3bc6e-7b95-472f-9f28-84414fa8e54f-kube-api-access-298jq\") pod \"cluster-monitoring-operator-6d5b84845-z4lcg\" (UID: \"5de3bc6e-7b95-472f-9f28-84414fa8e54f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg" Feb 02 13:07:43 crc kubenswrapper[4721]: I0202 13:07:43.231582 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5de3bc6e-7b95-472f-9f28-84414fa8e54f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-z4lcg\" (UID: \"5de3bc6e-7b95-472f-9f28-84414fa8e54f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg" Feb 02 13:07:43 crc kubenswrapper[4721]: I0202 13:07:43.232658 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5de3bc6e-7b95-472f-9f28-84414fa8e54f-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-z4lcg\" (UID: \"5de3bc6e-7b95-472f-9f28-84414fa8e54f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg" Feb 02 13:07:43 crc kubenswrapper[4721]: I0202 13:07:43.239212 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5de3bc6e-7b95-472f-9f28-84414fa8e54f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-z4lcg\" (UID: \"5de3bc6e-7b95-472f-9f28-84414fa8e54f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg" Feb 02 13:07:43 crc kubenswrapper[4721]: I0202 13:07:43.254018 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-298jq\" (UniqueName: \"kubernetes.io/projected/5de3bc6e-7b95-472f-9f28-84414fa8e54f-kube-api-access-298jq\") pod \"cluster-monitoring-operator-6d5b84845-z4lcg\" (UID: \"5de3bc6e-7b95-472f-9f28-84414fa8e54f\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg" Feb 02 13:07:43 crc kubenswrapper[4721]: I0202 13:07:43.277648 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg" Feb 02 13:07:43 crc kubenswrapper[4721]: I0202 13:07:43.728060 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg"] Feb 02 13:07:43 crc kubenswrapper[4721]: I0202 13:07:43.761569 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg" event={"ID":"5de3bc6e-7b95-472f-9f28-84414fa8e54f","Type":"ContainerStarted","Data":"dafcff591e88db5db9959a8fb275dae4293f8e5f2446448eb2a2c1af2443abc5"} Feb 02 13:07:44 crc kubenswrapper[4721]: I0202 13:07:44.763694 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:07:44 crc kubenswrapper[4721]: I0202 13:07:44.764662 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:07:46 crc kubenswrapper[4721]: I0202 13:07:46.781451 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg" event={"ID":"5de3bc6e-7b95-472f-9f28-84414fa8e54f","Type":"ContainerStarted","Data":"0cc61598d53f44af4ac9dd5a6d6d9feb6b77c30d6bfacc5275df7e8e95f7394b"} Feb 02 13:07:46 crc kubenswrapper[4721]: I0202 13:07:46.796658 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-z4lcg" podStartSLOduration=2.212937993 podStartE2EDuration="4.796634878s" podCreationTimestamp="2026-02-02 13:07:42 +0000 UTC" firstStartedPulling="2026-02-02 13:07:43.745616142 +0000 UTC m=+404.048130531" lastFinishedPulling="2026-02-02 13:07:46.329313027 +0000 UTC m=+406.631827416" observedRunningTime="2026-02-02 13:07:46.793297216 +0000 UTC m=+407.095811605" watchObservedRunningTime="2026-02-02 13:07:46.796634878 +0000 UTC m=+407.099149257" Feb 02 13:07:47 crc kubenswrapper[4721]: I0202 13:07:47.083399 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dh4qm"] Feb 02 13:07:47 crc kubenswrapper[4721]: I0202 13:07:47.084298 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dh4qm" Feb 02 13:07:47 crc kubenswrapper[4721]: I0202 13:07:47.088249 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-8k4jz" Feb 02 13:07:47 crc kubenswrapper[4721]: I0202 13:07:47.088600 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Feb 02 13:07:47 crc kubenswrapper[4721]: I0202 13:07:47.099604 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dh4qm"] Feb 02 13:07:47 crc kubenswrapper[4721]: I0202 13:07:47.195428 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a960c725-d05f-4af4-bf9b-aee9a8e8ffbe-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dh4qm\" (UID: \"a960c725-d05f-4af4-bf9b-aee9a8e8ffbe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dh4qm" Feb 02 13:07:47 crc kubenswrapper[4721]: I0202 13:07:47.296376 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a960c725-d05f-4af4-bf9b-aee9a8e8ffbe-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dh4qm\" (UID: \"a960c725-d05f-4af4-bf9b-aee9a8e8ffbe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dh4qm" Feb 02 13:07:47 crc kubenswrapper[4721]: I0202 13:07:47.302914 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a960c725-d05f-4af4-bf9b-aee9a8e8ffbe-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-dh4qm\" (UID: \"a960c725-d05f-4af4-bf9b-aee9a8e8ffbe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dh4qm" Feb 02 13:07:47 crc kubenswrapper[4721]: I0202 13:07:47.401658 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dh4qm" Feb 02 13:07:47 crc kubenswrapper[4721]: I0202 13:07:47.801128 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dh4qm"] Feb 02 13:07:48 crc kubenswrapper[4721]: I0202 13:07:48.797125 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dh4qm" event={"ID":"a960c725-d05f-4af4-bf9b-aee9a8e8ffbe","Type":"ContainerStarted","Data":"4931697aae06df67eeb7ddf8ecdac0170e6c1491c1d06241b0757ab36570eff4"} Feb 02 13:07:51 crc kubenswrapper[4721]: I0202 13:07:51.823843 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dh4qm" event={"ID":"a960c725-d05f-4af4-bf9b-aee9a8e8ffbe","Type":"ContainerStarted","Data":"e30334e8e451cfe04ca398502eac04e5a2354cfaf54fcdef7ee593b7844b0b85"} Feb 02 13:07:51 crc kubenswrapper[4721]: I0202 13:07:51.824243 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dh4qm" Feb 02 13:07:51 crc kubenswrapper[4721]: I0202 13:07:51.829908 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dh4qm" Feb 02 13:07:51 crc kubenswrapper[4721]: I0202 13:07:51.861671 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-dh4qm" podStartSLOduration=1.882668263 podStartE2EDuration="4.861646495s" podCreationTimestamp="2026-02-02 13:07:47 +0000 UTC" firstStartedPulling="2026-02-02 13:07:47.806681307 +0000 UTC m=+408.109195686" lastFinishedPulling="2026-02-02 13:07:50.785659529 +0000 UTC m=+411.088173918" observedRunningTime="2026-02-02 13:07:51.844471629 +0000 UTC m=+412.146986018" watchObservedRunningTime="2026-02-02 13:07:51.861646495 +0000 UTC m=+412.164160904" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.139341 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-j2fmj"] Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.140391 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.142363 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.142534 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-8js9g" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.142782 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.143740 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.149653 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-j2fmj"] Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.263344 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c45de1fe-3902-4bc0-8cf6-0b3312c92c9e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-j2fmj\" (UID: \"c45de1fe-3902-4bc0-8cf6-0b3312c92c9e\") " pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.263420 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c45de1fe-3902-4bc0-8cf6-0b3312c92c9e-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-j2fmj\" (UID: \"c45de1fe-3902-4bc0-8cf6-0b3312c92c9e\") " pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.263482 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c45de1fe-3902-4bc0-8cf6-0b3312c92c9e-metrics-client-ca\") pod \"prometheus-operator-db54df47d-j2fmj\" (UID: \"c45de1fe-3902-4bc0-8cf6-0b3312c92c9e\") " pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.263507 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnzb2\" (UniqueName: \"kubernetes.io/projected/c45de1fe-3902-4bc0-8cf6-0b3312c92c9e-kube-api-access-tnzb2\") pod \"prometheus-operator-db54df47d-j2fmj\" (UID: \"c45de1fe-3902-4bc0-8cf6-0b3312c92c9e\") " pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.365279 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c45de1fe-3902-4bc0-8cf6-0b3312c92c9e-metrics-client-ca\") pod \"prometheus-operator-db54df47d-j2fmj\" (UID: \"c45de1fe-3902-4bc0-8cf6-0b3312c92c9e\") " pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.365337 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnzb2\" (UniqueName: \"kubernetes.io/projected/c45de1fe-3902-4bc0-8cf6-0b3312c92c9e-kube-api-access-tnzb2\") pod 
\"prometheus-operator-db54df47d-j2fmj\" (UID: \"c45de1fe-3902-4bc0-8cf6-0b3312c92c9e\") " pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.365391 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c45de1fe-3902-4bc0-8cf6-0b3312c92c9e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-j2fmj\" (UID: \"c45de1fe-3902-4bc0-8cf6-0b3312c92c9e\") " pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.365442 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c45de1fe-3902-4bc0-8cf6-0b3312c92c9e-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-j2fmj\" (UID: \"c45de1fe-3902-4bc0-8cf6-0b3312c92c9e\") " pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.367191 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c45de1fe-3902-4bc0-8cf6-0b3312c92c9e-metrics-client-ca\") pod \"prometheus-operator-db54df47d-j2fmj\" (UID: \"c45de1fe-3902-4bc0-8cf6-0b3312c92c9e\") " pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.374368 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c45de1fe-3902-4bc0-8cf6-0b3312c92c9e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-j2fmj\" (UID: \"c45de1fe-3902-4bc0-8cf6-0b3312c92c9e\") " pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.376125 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c45de1fe-3902-4bc0-8cf6-0b3312c92c9e-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-j2fmj\" (UID: \"c45de1fe-3902-4bc0-8cf6-0b3312c92c9e\") " pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.385584 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnzb2\" (UniqueName: \"kubernetes.io/projected/c45de1fe-3902-4bc0-8cf6-0b3312c92c9e-kube-api-access-tnzb2\") pod \"prometheus-operator-db54df47d-j2fmj\" (UID: \"c45de1fe-3902-4bc0-8cf6-0b3312c92c9e\") " pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.458599 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" Feb 02 13:07:52 crc kubenswrapper[4721]: I0202 13:07:52.844962 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-j2fmj"] Feb 02 13:07:52 crc kubenswrapper[4721]: W0202 13:07:52.850622 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc45de1fe_3902_4bc0_8cf6_0b3312c92c9e.slice/crio-ed6e64b27ff7a41568c005a31411f40b742531151ee0f85fdaf390e9ed43b35f WatchSource:0}: Error finding container ed6e64b27ff7a41568c005a31411f40b742531151ee0f85fdaf390e9ed43b35f: Status 404 returned error can't find the container with id ed6e64b27ff7a41568c005a31411f40b742531151ee0f85fdaf390e9ed43b35f Feb 02 13:07:53 crc kubenswrapper[4721]: I0202 13:07:53.850712 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" event={"ID":"c45de1fe-3902-4bc0-8cf6-0b3312c92c9e","Type":"ContainerStarted","Data":"ed6e64b27ff7a41568c005a31411f40b742531151ee0f85fdaf390e9ed43b35f"} Feb 02 13:07:57 crc kubenswrapper[4721]: I0202 13:07:57.958138 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" event={"ID":"c45de1fe-3902-4bc0-8cf6-0b3312c92c9e","Type":"ContainerStarted","Data":"f1444fdd436c5b8f212389e79bb38d2955fc2e11a97aa2a8f54c6548f061ecc1"} Feb 02 13:07:57 crc kubenswrapper[4721]: I0202 13:07:57.958732 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" event={"ID":"c45de1fe-3902-4bc0-8cf6-0b3312c92c9e","Type":"ContainerStarted","Data":"99f8574cf722fb3b223e8376e11c47089ae97133d665376b3b8176d028548f9e"} Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.522020 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-j2fmj" podStartSLOduration=4.970323901 podStartE2EDuration="7.522005129s" podCreationTimestamp="2026-02-02 13:07:52 +0000 UTC" firstStartedPulling="2026-02-02 13:07:52.853253494 +0000 UTC m=+413.155767883" lastFinishedPulling="2026-02-02 13:07:55.404934712 +0000 UTC m=+415.707449111" observedRunningTime="2026-02-02 13:07:57.975540135 +0000 UTC m=+418.278054524" watchObservedRunningTime="2026-02-02 13:07:59.522005129 +0000 UTC m=+419.824519518" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.523432 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-kfp42"] Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.524326 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.528247 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.529339 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.529388 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-xn7ll" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.558880 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-kfp42"] Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.561728 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t76sw\" (UniqueName: \"kubernetes.io/projected/ba42cfe6-f3df-45f0-ab80-4781ce41b9a8-kube-api-access-t76sw\") pod \"openshift-state-metrics-566fddb674-kfp42\" (UID: \"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.561802 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba42cfe6-f3df-45f0-ab80-4781ce41b9a8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-kfp42\" (UID: \"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.561832 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba42cfe6-f3df-45f0-ab80-4781ce41b9a8-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-kfp42\" (UID: \"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.561873 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ba42cfe6-f3df-45f0-ab80-4781ce41b9a8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-kfp42\" (UID: \"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.611977 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk"] Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.613113 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.615986 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.618468 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-jgvh5" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.618623 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.621150 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.638288 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk"] Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.643563 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tdqc9"] Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.644847 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.646669 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.646977 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-v45hk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.648915 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.663186 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t76sw\" (UniqueName: \"kubernetes.io/projected/ba42cfe6-f3df-45f0-ab80-4781ce41b9a8-kube-api-access-t76sw\") pod \"openshift-state-metrics-566fddb674-kfp42\" (UID: \"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.663228 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-sys\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.663253 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-root\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.663280 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/91030360-19d6-44e9-b7a3-4662fd652353-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: 
\"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.663303 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.663325 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba42cfe6-f3df-45f0-ab80-4781ce41b9a8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-kfp42\" (UID: \"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.663351 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba42cfe6-f3df-45f0-ab80-4781ce41b9a8-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-kfp42\" (UID: \"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.663371 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99gz5\" (UniqueName: \"kubernetes.io/projected/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-kube-api-access-99gz5\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.663401 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/91030360-19d6-44e9-b7a3-4662fd652353-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.663426 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-node-exporter-textfile\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.663495 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-node-exporter-wtmp\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.663628 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/91030360-19d6-44e9-b7a3-4662fd652353-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " 
pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.664484 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba42cfe6-f3df-45f0-ab80-4781ce41b9a8-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-kfp42\" (UID: \"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.664587 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ba42cfe6-f3df-45f0-ab80-4781ce41b9a8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-kfp42\" (UID: \"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.664646 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/91030360-19d6-44e9-b7a3-4662fd652353-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.664989 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/91030360-19d6-44e9-b7a3-4662fd652353-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.665057 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-node-exporter-tls\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.665102 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-metrics-client-ca\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.665157 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n994k\" (UniqueName: \"kubernetes.io/projected/91030360-19d6-44e9-b7a3-4662fd652353-kube-api-access-n994k\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.672702 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba42cfe6-f3df-45f0-ab80-4781ce41b9a8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-kfp42\" (UID: \"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" 
Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.673056 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ba42cfe6-f3df-45f0-ab80-4781ce41b9a8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-kfp42\" (UID: \"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.683806 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t76sw\" (UniqueName: \"kubernetes.io/projected/ba42cfe6-f3df-45f0-ab80-4781ce41b9a8-kube-api-access-t76sw\") pod \"openshift-state-metrics-566fddb674-kfp42\" (UID: \"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.766537 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-node-exporter-tls\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.766596 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-metrics-client-ca\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: E0202 13:07:59.766731 4721 secret.go:188] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Feb 02 13:07:59 crc kubenswrapper[4721]: E0202 13:07:59.766810 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-node-exporter-tls podName:b2e082e0-4057-4d2e-bcb3-dc5286f0f705 nodeName:}" failed. No retries permitted until 2026-02-02 13:08:00.266788257 +0000 UTC m=+420.569302646 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-node-exporter-tls") pod "node-exporter-tdqc9" (UID: "b2e082e0-4057-4d2e-bcb3-dc5286f0f705") : secret "node-exporter-tls" not found Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.767510 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-metrics-client-ca\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.767572 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n994k\" (UniqueName: \"kubernetes.io/projected/91030360-19d6-44e9-b7a3-4662fd652353-kube-api-access-n994k\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.767633 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-sys\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.767715 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-sys\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.767754 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-root\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.767785 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/91030360-19d6-44e9-b7a3-4662fd652353-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.767839 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.767853 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-root\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.768669 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/91030360-19d6-44e9-b7a3-4662fd652353-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.768818 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99gz5\" (UniqueName: \"kubernetes.io/projected/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-kube-api-access-99gz5\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.769234 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/91030360-19d6-44e9-b7a3-4662fd652353-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.769287 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-node-exporter-textfile\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.769314 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-node-exporter-wtmp\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.769331 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/91030360-19d6-44e9-b7a3-4662fd652353-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.769386 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/91030360-19d6-44e9-b7a3-4662fd652353-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.769412 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/91030360-19d6-44e9-b7a3-4662fd652353-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.769953 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/91030360-19d6-44e9-b7a3-4662fd652353-metrics-client-ca\") pod 
\"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.769806 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-node-exporter-textfile\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.769548 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-node-exporter-wtmp\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.770676 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/91030360-19d6-44e9-b7a3-4662fd652353-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.773686 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.774355 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/91030360-19d6-44e9-b7a3-4662fd652353-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.777605 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/91030360-19d6-44e9-b7a3-4662fd652353-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.786872 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n994k\" (UniqueName: \"kubernetes.io/projected/91030360-19d6-44e9-b7a3-4662fd652353-kube-api-access-n994k\") pod \"kube-state-metrics-777cb5bd5d-trwbk\" (UID: \"91030360-19d6-44e9-b7a3-4662fd652353\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.793997 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99gz5\" (UniqueName: \"kubernetes.io/projected/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-kube-api-access-99gz5\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.841116 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" Feb 02 13:07:59 crc kubenswrapper[4721]: I0202 13:07:59.928649 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.235350 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-kfp42"] Feb 02 13:08:00 crc kubenswrapper[4721]: W0202 13:08:00.245499 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba42cfe6_f3df_45f0_ab80_4781ce41b9a8.slice/crio-c6366f9b29d099585dc8ac4669e3ea8513fed89fc254a90ae06281b0dd22e2c6 WatchSource:0}: Error finding container c6366f9b29d099585dc8ac4669e3ea8513fed89fc254a90ae06281b0dd22e2c6: Status 404 returned error can't find the container with id c6366f9b29d099585dc8ac4669e3ea8513fed89fc254a90ae06281b0dd22e2c6 Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.275603 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-node-exporter-tls\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.281308 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b2e082e0-4057-4d2e-bcb3-dc5286f0f705-node-exporter-tls\") pod \"node-exporter-tdqc9\" (UID: \"b2e082e0-4057-4d2e-bcb3-dc5286f0f705\") " pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.454452 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk"] Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.558956 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-v45hk" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.567627 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tdqc9" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.633933 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.635949 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.646016 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.646959 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.646959 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-pn5xs" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.647725 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.648034 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.648200 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.648378 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.648542 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.657224 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.676872 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.689759 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/457655c2-194f-4643-804e-1024580bb2dc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.690185 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.690211 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.690229 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/457655c2-194f-4643-804e-1024580bb2dc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " 
pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.690254 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/457655c2-194f-4643-804e-1024580bb2dc-config-out\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.690272 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/457655c2-194f-4643-804e-1024580bb2dc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.690290 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-config-volume\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.690308 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.690330 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-web-config\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.690349 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/457655c2-194f-4643-804e-1024580bb2dc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.690432 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbgcw\" (UniqueName: \"kubernetes.io/projected/457655c2-194f-4643-804e-1024580bb2dc-kube-api-access-rbgcw\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.690494 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.791496 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.791546 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.791577 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/457655c2-194f-4643-804e-1024580bb2dc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.791609 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/457655c2-194f-4643-804e-1024580bb2dc-config-out\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.791629 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/457655c2-194f-4643-804e-1024580bb2dc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.791649 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-config-volume\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.791671 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.791696 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-web-config\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.791721 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/457655c2-194f-4643-804e-1024580bb2dc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.791745 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbgcw\" 
(UniqueName: \"kubernetes.io/projected/457655c2-194f-4643-804e-1024580bb2dc-kube-api-access-rbgcw\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.791770 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.791805 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/457655c2-194f-4643-804e-1024580bb2dc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.792182 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/457655c2-194f-4643-804e-1024580bb2dc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.792824 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/457655c2-194f-4643-804e-1024580bb2dc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.793048 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/457655c2-194f-4643-804e-1024580bb2dc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.796856 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-config-volume\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.796915 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.797711 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.799088 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-web-config\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.799399 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.808521 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbgcw\" (UniqueName: \"kubernetes.io/projected/457655c2-194f-4643-804e-1024580bb2dc-kube-api-access-rbgcw\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.811563 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/457655c2-194f-4643-804e-1024580bb2dc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.811562 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/457655c2-194f-4643-804e-1024580bb2dc-config-out\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.811672 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/457655c2-194f-4643-804e-1024580bb2dc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"457655c2-194f-4643-804e-1024580bb2dc\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.963383 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.990126 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" event={"ID":"91030360-19d6-44e9-b7a3-4662fd652353","Type":"ContainerStarted","Data":"3b8cbc931059e834864157637562a5feee03a982e922586a7e59170d1d27518d"} Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.992832 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tdqc9" event={"ID":"b2e082e0-4057-4d2e-bcb3-dc5286f0f705","Type":"ContainerStarted","Data":"0928c5a0d605f5581fd8312d1bfddd598456ab35fce910457efbe9de99d72eaa"} Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.995952 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" event={"ID":"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8","Type":"ContainerStarted","Data":"1b7bf1966c3b128bd1483a149044887fe3a950e5d3fd087b17fb31775c9aac80"} Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.996006 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" event={"ID":"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8","Type":"ContainerStarted","Data":"ac1a62757b7b19989fc42bc889c03f7a6ad0d02af641d6914d8d0df3947cb103"} Feb 02 13:08:00 crc kubenswrapper[4721]: I0202 13:08:00.996022 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" event={"ID":"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8","Type":"ContainerStarted","Data":"c6366f9b29d099585dc8ac4669e3ea8513fed89fc254a90ae06281b0dd22e2c6"} Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.356749 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Feb 02 13:08:01 crc kubenswrapper[4721]: W0202 13:08:01.365000 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod457655c2_194f_4643_804e_1024580bb2dc.slice/crio-c9c55688364b12f13c1729955131cdddc7e10c6fbe6221be9d4e4748c5d571d9 WatchSource:0}: Error finding container c9c55688364b12f13c1729955131cdddc7e10c6fbe6221be9d4e4748c5d571d9: Status 404 returned error can't find the container with id c9c55688364b12f13c1729955131cdddc7e10c6fbe6221be9d4e4748c5d571d9 Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.483171 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-c45b586c-w7sw7"] Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.485053 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.489668 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.489705 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.489859 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.489896 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-qhqzb" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.489946 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.489976 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-94jvo0rjqi4tt" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.491103 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.492771 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-c45b586c-w7sw7"] Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.499753 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-metrics-client-ca\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.499809 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.499871 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.499894 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.499947 4721 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.499976 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-grpc-tls\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.500006 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vh7p\" (UniqueName: \"kubernetes.io/projected/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-kube-api-access-4vh7p\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.500035 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-thanos-querier-tls\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.601755 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.604670 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.604744 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-grpc-tls\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.605443 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vh7p\" (UniqueName: \"kubernetes.io/projected/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-kube-api-access-4vh7p\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.605508 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-thanos-querier-tls\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.605548 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-metrics-client-ca\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.605604 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.606360 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-metrics-client-ca\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.607161 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.609680 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.610113 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.610387 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-grpc-tls\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.610537 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-thanos-querier-tls\") pod 
\"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.611177 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.613429 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.626483 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vh7p\" (UniqueName: \"kubernetes.io/projected/8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6-kube-api-access-4vh7p\") pod \"thanos-querier-c45b586c-w7sw7\" (UID: \"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6\") " pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:01 crc kubenswrapper[4721]: I0202 13:08:01.811796 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" Feb 02 13:08:02 crc kubenswrapper[4721]: I0202 13:08:02.007313 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"457655c2-194f-4643-804e-1024580bb2dc","Type":"ContainerStarted","Data":"c9c55688364b12f13c1729955131cdddc7e10c6fbe6221be9d4e4748c5d571d9"} Feb 02 13:08:02 crc kubenswrapper[4721]: I0202 13:08:02.995720 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-c45b586c-w7sw7"] Feb 02 13:08:03 crc kubenswrapper[4721]: I0202 13:08:03.020775 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tdqc9" event={"ID":"b2e082e0-4057-4d2e-bcb3-dc5286f0f705","Type":"ContainerStarted","Data":"a97cf8c7cb8dac71f8d7d6343b9b5bec8afc00f050646bb4233eda1ef42fc649"} Feb 02 13:08:03 crc kubenswrapper[4721]: I0202 13:08:03.025165 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" event={"ID":"ba42cfe6-f3df-45f0-ab80-4781ce41b9a8","Type":"ContainerStarted","Data":"c85ea78c3369ef1e922e98597dddfe15d49464d701a13c80f730b8e2a9a14b09"} Feb 02 13:08:03 crc kubenswrapper[4721]: I0202 13:08:03.027345 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" event={"ID":"91030360-19d6-44e9-b7a3-4662fd652353","Type":"ContainerStarted","Data":"57c494a76b8cf0bedc571ecbbb7f33e403a2afdd15b518dff1bf230652468193"} Feb 02 13:08:03 crc kubenswrapper[4721]: I0202 13:08:03.061594 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-kfp42" podStartSLOduration=2.146920806 podStartE2EDuration="4.061545814s" podCreationTimestamp="2026-02-02 13:07:59 +0000 UTC" firstStartedPulling="2026-02-02 13:08:00.667263114 +0000 UTC m=+420.969777503" lastFinishedPulling="2026-02-02 
13:08:02.581888122 +0000 UTC m=+422.884402511" observedRunningTime="2026-02-02 13:08:03.056095813 +0000 UTC m=+423.358610212" watchObservedRunningTime="2026-02-02 13:08:03.061545814 +0000 UTC m=+423.364060233" Feb 02 13:08:03 crc kubenswrapper[4721]: W0202 13:08:03.293284 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e14e0ea_313a_415a_b59f_b3f1a6f1c7a6.slice/crio-94543e454ffd29ff1ae9db295669e38cb1c120486d362a006003075eb52744a2 WatchSource:0}: Error finding container 94543e454ffd29ff1ae9db295669e38cb1c120486d362a006003075eb52744a2: Status 404 returned error can't find the container with id 94543e454ffd29ff1ae9db295669e38cb1c120486d362a006003075eb52744a2 Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.040779 4721 generic.go:334] "Generic (PLEG): container finished" podID="b2e082e0-4057-4d2e-bcb3-dc5286f0f705" containerID="a97cf8c7cb8dac71f8d7d6343b9b5bec8afc00f050646bb4233eda1ef42fc649" exitCode=0 Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.041244 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tdqc9" event={"ID":"b2e082e0-4057-4d2e-bcb3-dc5286f0f705","Type":"ContainerDied","Data":"a97cf8c7cb8dac71f8d7d6343b9b5bec8afc00f050646bb4233eda1ef42fc649"} Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.042987 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" event={"ID":"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6","Type":"ContainerStarted","Data":"94543e454ffd29ff1ae9db295669e38cb1c120486d362a006003075eb52744a2"} Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.046690 4721 generic.go:334] "Generic (PLEG): container finished" podID="457655c2-194f-4643-804e-1024580bb2dc" containerID="1634746a7e0ac546432275ab3f0289d8acf1441a62c70fe647ce7766ffd4d59a" exitCode=0 Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.046890 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"457655c2-194f-4643-804e-1024580bb2dc","Type":"ContainerDied","Data":"1634746a7e0ac546432275ab3f0289d8acf1441a62c70fe647ce7766ffd4d59a"} Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.049635 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" event={"ID":"91030360-19d6-44e9-b7a3-4662fd652353","Type":"ContainerStarted","Data":"9d78edb12edaa4134f3d5f4ff0235d875316cbf6cdbec4bf5987343405d9776c"} Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.049675 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" event={"ID":"91030360-19d6-44e9-b7a3-4662fd652353","Type":"ContainerStarted","Data":"3f637eb1833ab57f426ced4edbe770701e8f1b08b20747c8420793731a2e0e06"} Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.112228 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-trwbk" podStartSLOduration=2.9374809859999997 podStartE2EDuration="5.112206228s" podCreationTimestamp="2026-02-02 13:07:59 +0000 UTC" firstStartedPulling="2026-02-02 13:08:00.45843594 +0000 UTC m=+420.760950329" lastFinishedPulling="2026-02-02 13:08:02.633161182 +0000 UTC m=+422.935675571" observedRunningTime="2026-02-02 13:08:04.108098404 +0000 UTC m=+424.410612803" watchObservedRunningTime="2026-02-02 13:08:04.112206228 +0000 UTC m=+424.414720617" Feb 02 13:08:04 crc 
kubenswrapper[4721]: I0202 13:08:04.275594 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-56c598d788-j9gjl"] Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.276815 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.303536 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56c598d788-j9gjl"] Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.372710 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-trusted-ca-bundle\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.372802 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/848aa930-7630-4eed-b114-23853a30daac-console-serving-cert\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.372825 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-console-config\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.372844 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/848aa930-7630-4eed-b114-23853a30daac-console-oauth-config\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.372878 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st584\" (UniqueName: \"kubernetes.io/projected/848aa930-7630-4eed-b114-23853a30daac-kube-api-access-st584\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.373031 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-oauth-serving-cert\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.373136 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-service-ca\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.474553 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-service-ca\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.474621 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-trusted-ca-bundle\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.474720 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/848aa930-7630-4eed-b114-23853a30daac-console-serving-cert\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.474741 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-console-config\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.474764 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/848aa930-7630-4eed-b114-23853a30daac-console-oauth-config\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.474791 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st584\" (UniqueName: \"kubernetes.io/projected/848aa930-7630-4eed-b114-23853a30daac-kube-api-access-st584\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.474830 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-oauth-serving-cert\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.475888 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-service-ca\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.476285 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-console-config\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.476321 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-oauth-serving-cert\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.477214 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-trusted-ca-bundle\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.484033 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/848aa930-7630-4eed-b114-23853a30daac-console-oauth-config\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.489232 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/848aa930-7630-4eed-b114-23853a30daac-console-serving-cert\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.501047 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st584\" (UniqueName: \"kubernetes.io/projected/848aa930-7630-4eed-b114-23853a30daac-kube-api-access-st584\") pod \"console-56c598d788-j9gjl\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.603997 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.818989 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-67df565f78-vmr8d"] Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.820529 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.823338 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.823358 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.823548 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.823587 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-cm94e6tf5h16h"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.823731 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-zh6bg"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.823871 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.872709 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-67df565f78-vmr8d"]
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.881026 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtj2m\" (UniqueName: \"kubernetes.io/projected/4b390361-5a0f-423e-856c-dc0e11c32afa-kube-api-access-gtj2m\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.881108 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b390361-5a0f-423e-856c-dc0e11c32afa-client-ca-bundle\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.881129 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b390361-5a0f-423e-856c-dc0e11c32afa-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.881163 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4b390361-5a0f-423e-856c-dc0e11c32afa-audit-log\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.881187 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4b390361-5a0f-423e-856c-dc0e11c32afa-metrics-server-audit-profiles\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.881209 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4b390361-5a0f-423e-856c-dc0e11c32afa-secret-metrics-server-tls\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.881247 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4b390361-5a0f-423e-856c-dc0e11c32afa-secret-metrics-client-certs\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.982248 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4b390361-5a0f-423e-856c-dc0e11c32afa-audit-log\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.982319 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4b390361-5a0f-423e-856c-dc0e11c32afa-metrics-server-audit-profiles\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.982357 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4b390361-5a0f-423e-856c-dc0e11c32afa-secret-metrics-server-tls\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.982413 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4b390361-5a0f-423e-856c-dc0e11c32afa-secret-metrics-client-certs\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.982448 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtj2m\" (UniqueName: \"kubernetes.io/projected/4b390361-5a0f-423e-856c-dc0e11c32afa-kube-api-access-gtj2m\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.982484 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b390361-5a0f-423e-856c-dc0e11c32afa-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.982503 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b390361-5a0f-423e-856c-dc0e11c32afa-client-ca-bundle\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.982865 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4b390361-5a0f-423e-856c-dc0e11c32afa-audit-log\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.984961 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b390361-5a0f-423e-856c-dc0e11c32afa-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.985646 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4b390361-5a0f-423e-856c-dc0e11c32afa-metrics-server-audit-profiles\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.988481 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b390361-5a0f-423e-856c-dc0e11c32afa-client-ca-bundle\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.988553 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4b390361-5a0f-423e-856c-dc0e11c32afa-secret-metrics-client-certs\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:04 crc kubenswrapper[4721]: I0202 13:08:04.989009 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4b390361-5a0f-423e-856c-dc0e11c32afa-secret-metrics-server-tls\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.001508 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtj2m\" (UniqueName: \"kubernetes.io/projected/4b390361-5a0f-423e-856c-dc0e11c32afa-kube-api-access-gtj2m\") pod \"metrics-server-67df565f78-vmr8d\" (UID: \"4b390361-5a0f-423e-856c-dc0e11c32afa\") " pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.005756 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56c598d788-j9gjl"]
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.061298 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tdqc9" event={"ID":"b2e082e0-4057-4d2e-bcb3-dc5286f0f705","Type":"ContainerStarted","Data":"61132e4046d20b2beecf9834c1430c15fe9b1be51c120c430351662cc889f7a0"}
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.061349 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tdqc9" event={"ID":"b2e082e0-4057-4d2e-bcb3-dc5286f0f705","Type":"ContainerStarted","Data":"70b2191bafae7d0c557bb2d625421fdd4a5fddb420b7d6052056206a5ccc8038"}
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.090714 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tdqc9" podStartSLOduration=4.059647081 podStartE2EDuration="6.090694864s" podCreationTimestamp="2026-02-02 13:07:59 +0000 UTC" firstStartedPulling="2026-02-02 13:08:00.61333862 +0000 UTC m=+420.915853009" lastFinishedPulling="2026-02-02 13:08:02.644386403 +0000 UTC m=+422.946900792" observedRunningTime="2026-02-02 13:08:05.080584504 +0000 UTC m=+425.383098913" watchObservedRunningTime="2026-02-02 13:08:05.090694864 +0000 UTC m=+425.393209253"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.139414 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.280944 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-f7c547bd9-pxbl7"]
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.283865 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-f7c547bd9-pxbl7"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.286744 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.288223 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.289002 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-f7c547bd9-pxbl7"]
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.387789 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bf61d6b4-c01d-487e-bc9d-63d6b7654ce8-monitoring-plugin-cert\") pod \"monitoring-plugin-f7c547bd9-pxbl7\" (UID: \"bf61d6b4-c01d-487e-bc9d-63d6b7654ce8\") " pod="openshift-monitoring/monitoring-plugin-f7c547bd9-pxbl7"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.488815 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bf61d6b4-c01d-487e-bc9d-63d6b7654ce8-monitoring-plugin-cert\") pod \"monitoring-plugin-f7c547bd9-pxbl7\" (UID: \"bf61d6b4-c01d-487e-bc9d-63d6b7654ce8\") " pod="openshift-monitoring/monitoring-plugin-f7c547bd9-pxbl7"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.492625 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/bf61d6b4-c01d-487e-bc9d-63d6b7654ce8-monitoring-plugin-cert\") pod \"monitoring-plugin-f7c547bd9-pxbl7\" (UID: \"bf61d6b4-c01d-487e-bc9d-63d6b7654ce8\") " pod="openshift-monitoring/monitoring-plugin-f7c547bd9-pxbl7"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.607217 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-f7c547bd9-pxbl7"
Feb 02 13:08:05 crc kubenswrapper[4721]: W0202 13:08:05.756872 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod848aa930_7630_4eed_b114_23853a30daac.slice/crio-fcb6049175d618686bd24850fd66d3afd1a92f853e4d4ca21439dea1da41c941 WatchSource:0}: Error finding container fcb6049175d618686bd24850fd66d3afd1a92f853e4d4ca21439dea1da41c941: Status 404 returned error can't find the container with id fcb6049175d618686bd24850fd66d3afd1a92f853e4d4ca21439dea1da41c941
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.829203 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.831825 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.834572 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.837589 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.837865 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-fx2vn"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.837962 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.838115 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.838185 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.838270 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-5n4f4oh7gp52d"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.838471 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.838573 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.839329 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.840850 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.848372 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.856760 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.862363 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.897733 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.897889 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.897955 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.897980 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.898100 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.898143 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.898176 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.898200 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.898217 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-web-config\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.898242 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-config\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.898273 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.898601 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.898624 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7rtg\" (UniqueName: \"kubernetes.io/projected/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-kube-api-access-z7rtg\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.898660 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.898709 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-config-out\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.898755 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.898792 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.898837 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.999826 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:05 crc kubenswrapper[4721]: I0202 13:08:05.999878 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:05.999902 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-web-config\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:05.999922 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-config\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:05.999941 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:05.999971 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:05.999986 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7rtg\" (UniqueName: \"kubernetes.io/projected/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-kube-api-access-z7rtg\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.000007 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.000023 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-config-out\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.000041 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.000058 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.000099 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.000122 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.000143 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.000165 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.000180 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.000210 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.000228 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.000981 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.001369 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.002033 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.006775 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.006786 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-config\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.006918 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.007560 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.009220 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.010081 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.010214 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.010397 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.010475 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.010498 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.010568 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.011900 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-config-out\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.013166 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.022766 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-web-config\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.027980 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7rtg\" (UniqueName: \"kubernetes.io/projected/6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49-kube-api-access-z7rtg\") pod \"prometheus-k8s-0\" (UID: \"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.068862 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56c598d788-j9gjl" event={"ID":"848aa930-7630-4eed-b114-23853a30daac","Type":"ContainerStarted","Data":"fcb6049175d618686bd24850fd66d3afd1a92f853e4d4ca21439dea1da41c941"}
Feb 02 13:08:06 crc kubenswrapper[4721]: I0202 13:08:06.150276 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:07 crc kubenswrapper[4721]: I0202 13:08:07.022693 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-67df565f78-vmr8d"]
Feb 02 13:08:07 crc kubenswrapper[4721]: W0202 13:08:07.031016 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b390361_5a0f_423e_856c_dc0e11c32afa.slice/crio-502db066fbe14662fbccb4b34c2e02e4a16454b2d8f5a88b36628f40bbb90b1a WatchSource:0}: Error finding container 502db066fbe14662fbccb4b34c2e02e4a16454b2d8f5a88b36628f40bbb90b1a: Status 404 returned error can't find the container with id 502db066fbe14662fbccb4b34c2e02e4a16454b2d8f5a88b36628f40bbb90b1a
Feb 02 13:08:07 crc kubenswrapper[4721]: I0202 13:08:07.079671 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56c598d788-j9gjl" event={"ID":"848aa930-7630-4eed-b114-23853a30daac","Type":"ContainerStarted","Data":"98c0eac20465229d8bff6225571570b540d801c33450e2e81bc8f079fbc7cf32"}
Feb 02 13:08:07 crc kubenswrapper[4721]: I0202 13:08:07.083245 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"457655c2-194f-4643-804e-1024580bb2dc","Type":"ContainerStarted","Data":"6014a97404cf5ad936fba27378735c16a178bbda67ab9b8ee5d23e279778974b"}
Feb 02 13:08:07 crc kubenswrapper[4721]: I0202 13:08:07.083287 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"457655c2-194f-4643-804e-1024580bb2dc","Type":"ContainerStarted","Data":"e5be9ae087a265784a87924481dbf0d70d6cc88c0165438f8f649e806c47ce68"}
Feb 02 13:08:07 crc kubenswrapper[4721]: I0202 13:08:07.089926 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-67df565f78-vmr8d" event={"ID":"4b390361-5a0f-423e-856c-dc0e11c32afa","Type":"ContainerStarted","Data":"502db066fbe14662fbccb4b34c2e02e4a16454b2d8f5a88b36628f40bbb90b1a"}
Feb 02 13:08:07 crc kubenswrapper[4721]: I0202 13:08:07.093426 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" event={"ID":"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6","Type":"ContainerStarted","Data":"3b95b2e35c0e742e5d024aecd741781a45ed671ce2a62f655e84b5058c145d03"}
Feb 02 13:08:07 crc kubenswrapper[4721]: I0202 13:08:07.093468 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" event={"ID":"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6","Type":"ContainerStarted","Data":"bf0a86d1164c453faf37d61e4a79446e93c9dba37cb16d4b6fe5e036b7743d6e"}
Feb 02 13:08:07 crc kubenswrapper[4721]: I0202 13:08:07.100527 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-f7c547bd9-pxbl7"]
Feb 02 13:08:07 crc kubenswrapper[4721]: I0202 13:08:07.167916 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56c598d788-j9gjl" podStartSLOduration=3.167892334 podStartE2EDuration="3.167892334s" podCreationTimestamp="2026-02-02 13:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:08:07.108740246 +0000 UTC m=+427.411254645" watchObservedRunningTime="2026-02-02 13:08:07.167892334 +0000 UTC m=+427.470406723"
Feb 02 13:08:07 crc kubenswrapper[4721]: I0202 13:08:07.170239 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Feb 02 13:08:07 crc kubenswrapper[4721]: W0202 13:08:07.178253 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fdb4e96_b41b_4d7e_8a97_c0ed2705ed49.slice/crio-f5d0b5dc0a903cd81476b7fca9028d99ceb6595f223e4e576b99ebc2e7e736ac WatchSource:0}: Error finding container f5d0b5dc0a903cd81476b7fca9028d99ceb6595f223e4e576b99ebc2e7e736ac: Status 404 returned error can't find the container with id f5d0b5dc0a903cd81476b7fca9028d99ceb6595f223e4e576b99ebc2e7e736ac
Feb 02 13:08:08 crc kubenswrapper[4721]: I0202 13:08:08.108344 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" event={"ID":"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6","Type":"ContainerStarted","Data":"6f5be74219f20ff773d039749d933e7c2cfc283210466387fdc8b0bb405fd3ee"}
Feb 02 13:08:08 crc kubenswrapper[4721]: I0202 13:08:08.110409 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-f7c547bd9-pxbl7" event={"ID":"bf61d6b4-c01d-487e-bc9d-63d6b7654ce8","Type":"ContainerStarted","Data":"a0e1e6aaba0fac91908a6a08f3ac88027b1217e86b61ed748f66fa256466235d"}
Feb 02 13:08:08 crc kubenswrapper[4721]: I0202 13:08:08.117807 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"457655c2-194f-4643-804e-1024580bb2dc","Type":"ContainerStarted","Data":"e686c72aa34ec5e05d89ec828e5f4d9549b3fba03a08f608dd269fa494337b4e"}
Feb 02 13:08:08 crc kubenswrapper[4721]: I0202 13:08:08.117853 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"457655c2-194f-4643-804e-1024580bb2dc","Type":"ContainerStarted","Data":"dd09cf080dd6a71f2040e0dbe1ea210b2af0170ab715b399c00d8440e164100c"}
Feb 02 13:08:08 crc kubenswrapper[4721]: I0202 13:08:08.117868 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"457655c2-194f-4643-804e-1024580bb2dc","Type":"ContainerStarted","Data":"c4130c3a7ff7eeafacdfd6a229692ab0739ce9e0bb486c1e29d9b9d427354026"}
Feb 02 13:08:08 crc kubenswrapper[4721]: I0202 13:08:08.119704 4721 generic.go:334] "Generic (PLEG): container finished" podID="6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49" containerID="1c818404e0faa701e4e3f47978d2fa944d98d0b6c0d030c1cda3ff2480c99ac1" exitCode=0
Feb 02 13:08:08 crc kubenswrapper[4721]: I0202 13:08:08.119737 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49","Type":"ContainerDied","Data":"1c818404e0faa701e4e3f47978d2fa944d98d0b6c0d030c1cda3ff2480c99ac1"}
Feb 02 13:08:08 crc kubenswrapper[4721]: I0202 13:08:08.119766 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49","Type":"ContainerStarted","Data":"f5d0b5dc0a903cd81476b7fca9028d99ceb6595f223e4e576b99ebc2e7e736ac"}
Feb 02 13:08:10 crc kubenswrapper[4721]: I0202 13:08:10.134164 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-f7c547bd9-pxbl7" event={"ID":"bf61d6b4-c01d-487e-bc9d-63d6b7654ce8","Type":"ContainerStarted","Data":"2b0c7f1694516f76b0d9f071addf7def8ca086434f1c73016cb2c92bf9e26304"}
Feb 02 13:08:10 crc kubenswrapper[4721]: I0202 13:08:10.134498 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-f7c547bd9-pxbl7"
Feb 02 13:08:10 crc kubenswrapper[4721]: I0202 13:08:10.139694 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"457655c2-194f-4643-804e-1024580bb2dc","Type":"ContainerStarted","Data":"49f640be331dbb1f8380b34c42bc0c7e161ddced43c334595377ba9590eb0260"}
Feb 02 13:08:10 crc kubenswrapper[4721]: I0202 13:08:10.141203 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-67df565f78-vmr8d" event={"ID":"4b390361-5a0f-423e-856c-dc0e11c32afa","Type":"ContainerStarted","Data":"ccd743f96e31ef403c4d2a2fdec1a59e18855c530b39e212eda186903b39c653"}
Feb 02 13:08:10 crc kubenswrapper[4721]: I0202 13:08:10.143835 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-f7c547bd9-pxbl7"
Feb 02 13:08:10 crc kubenswrapper[4721]: I0202 13:08:10.144880 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" event={"ID":"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6","Type":"ContainerStarted","Data":"a05aca6394662561b1863f8833f90ece68c32e66e41bdcd7a2f8391bf9af2dff"}
Feb 02 13:08:10 crc kubenswrapper[4721]: I0202 13:08:10.144900 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" event={"ID":"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6","Type":"ContainerStarted","Data":"c34fb08a72cd0bd69ce0ac37e22dd1057bf4b2ffdcd638b51f9dfd275204a829"}
Feb 02 13:08:10 crc kubenswrapper[4721]: I0202 13:08:10.144910 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" event={"ID":"8e14e0ea-313a-415a-b59f-b3f1a6f1c7a6","Type":"ContainerStarted","Data":"7cb9fb54d27010886f81f94f5e5036f880f9c2bfca8528ca0e738eaadde15c2d"}
Feb 02 13:08:10 crc kubenswrapper[4721]: I0202 13:08:10.145048 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7"
Feb 02 13:08:10 crc kubenswrapper[4721]: I0202 13:08:10.152751 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-f7c547bd9-pxbl7" podStartSLOduration=2.959046082 podStartE2EDuration="5.152725268s" podCreationTimestamp="2026-02-02 13:08:05 +0000 UTC" firstStartedPulling="2026-02-02 13:08:07.10816758 +0000 UTC m=+427.410681969" lastFinishedPulling="2026-02-02 13:08:09.301846766 +0000 UTC m=+429.604361155" observedRunningTime="2026-02-02 13:08:10.149050627 +0000 UTC m=+430.451565026" watchObservedRunningTime="2026-02-02 13:08:10.152725268 +0000 UTC m=+430.455239657"
Feb 02 13:08:10 crc kubenswrapper[4721]: I0202 13:08:10.204731 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.539774546 podStartE2EDuration="10.204705167s" podCreationTimestamp="2026-02-02 13:08:00 +0000 UTC" firstStartedPulling="2026-02-02 13:08:01.367236297 +0000 UTC m=+421.669750686" lastFinishedPulling="2026-02-02 13:08:09.032166878 +0000 UTC m=+429.334681307" observedRunningTime="2026-02-02 13:08:10.200412508 +0000 UTC m=+430.502926917" watchObservedRunningTime="2026-02-02 13:08:10.204705167 +0000 UTC m=+430.507219556"
Feb 02 13:08:10 crc kubenswrapper[4721]: I0202 13:08:10.222261 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7" podStartSLOduration=3.457308395 podStartE2EDuration="9.222243043s" podCreationTimestamp="2026-02-02 13:08:01 +0000 UTC" firstStartedPulling="2026-02-02 13:08:03.295430251 +0000 UTC m=+423.597944640" lastFinishedPulling="2026-02-02 13:08:09.060364899 +0000 UTC m=+429.362879288" observedRunningTime="2026-02-02 13:08:10.221966586 +0000 UTC m=+430.524480975" watchObservedRunningTime="2026-02-02 13:08:10.222243043 +0000 UTC m=+430.524757442"
Feb 02 13:08:10 crc kubenswrapper[4721]: I0202 13:08:10.241287 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-67df565f78-vmr8d" podStartSLOduration=3.997268119 podStartE2EDuration="6.241272359s" podCreationTimestamp="2026-02-02 13:08:04 +0000 UTC" firstStartedPulling="2026-02-02 13:08:07.033249205 +0000 UTC m=+427.335763594" lastFinishedPulling="2026-02-02 13:08:09.277253445 +0000 UTC m=+429.579767834" observedRunningTime="2026-02-02 13:08:10.241053354 +0000 UTC m=+430.543567743" watchObservedRunningTime="2026-02-02 13:08:10.241272359 +0000 UTC m=+430.543786748"
Feb 02 13:08:11 crc kubenswrapper[4721]: I0202 13:08:11.159972 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-c45b586c-w7sw7"
Feb 02 13:08:12 crc kubenswrapper[4721]: I0202 13:08:12.162881 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49","Type":"ContainerStarted","Data":"8034925493acc9a842b3362da631ba2e378e3ea6e53347bfa87f5a9bf5d1f315"}
Feb 02 13:08:13 crc kubenswrapper[4721]: I0202 13:08:13.171859 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49","Type":"ContainerStarted","Data":"76b4303d16208fc1710fdb661ae395c4a9c0ba9d682c5ef2419bdd5d3f775032"}
Feb 02 13:08:13 crc kubenswrapper[4721]: I0202 13:08:13.171900 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49","Type":"ContainerStarted","Data":"029730ca64213cc0c7271b36994601e0c25adab46b40896e2ef4be464f4d9813"}
Feb 02 13:08:13 crc kubenswrapper[4721]: I0202 13:08:13.171915 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49","Type":"ContainerStarted","Data":"edbb97c0330ed4b68671f1be1e988325f2b163687c2abdab120af47ffd68d8cd"}
Feb 02 13:08:13 crc kubenswrapper[4721]: I0202 13:08:13.171929 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49","Type":"ContainerStarted","Data":"22bd3c7b420c6d7469ebc29d365aa850fb718e42a5c2bed75986f1e0311844f8"}
Feb 02 13:08:13 crc kubenswrapper[4721]: I0202 13:08:13.171940 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6fdb4e96-b41b-4d7e-8a97-c0ed2705ed49","Type":"ContainerStarted","Data":"7482981fa8cc59d4faf4f647912a9f2d95a317750854ed3924c3f6c0e2b62927"}
Feb 02 13:08:13 crc kubenswrapper[4721]: I0202 13:08:13.205170 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.445472693 podStartE2EDuration="8.205154703s" podCreationTimestamp="2026-02-02 13:08:05 +0000 UTC" firstStartedPulling="2026-02-02 13:08:08.145595008 +0000 UTC m=+428.448109397" lastFinishedPulling="2026-02-02 13:08:11.905277018 +0000 UTC m=+432.207791407" observedRunningTime="2026-02-02 13:08:13.200886495 +0000 UTC m=+433.503400904" watchObservedRunningTime="2026-02-02 13:08:13.205154703 +0000 UTC m=+433.507669092"
Feb 02 13:08:14 crc kubenswrapper[4721]: I0202 13:08:14.604950 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56c598d788-j9gjl"
Feb 02 13:08:14 crc kubenswrapper[4721]: I0202 13:08:14.605348 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-56c598d788-j9gjl"
Feb 02 13:08:14 crc kubenswrapper[4721]: I0202 13:08:14.610731 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56c598d788-j9gjl"
Feb 02 13:08:14 crc kubenswrapper[4721]: I0202 13:08:14.764667 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 13:08:14 crc kubenswrapper[4721]: I0202 13:08:14.764743 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 13:08:14 crc kubenswrapper[4721]: I0202 13:08:14.764799 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz"
Feb 02 13:08:14 crc kubenswrapper[4721]: I0202 13:08:14.765638 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b87a5ff0b46a353772eefb47418659c5151aa34831e5ecd0ccf0f601bac1ccfe"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 13:08:14 crc kubenswrapper[4721]: I0202 13:08:14.765747 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://b87a5ff0b46a353772eefb47418659c5151aa34831e5ecd0ccf0f601bac1ccfe" gracePeriod=600
Feb 02 13:08:15 crc kubenswrapper[4721]: I0202 13:08:15.185734 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="b87a5ff0b46a353772eefb47418659c5151aa34831e5ecd0ccf0f601bac1ccfe" exitCode=0
Feb 02 13:08:15 crc kubenswrapper[4721]: I0202 13:08:15.185799 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"b87a5ff0b46a353772eefb47418659c5151aa34831e5ecd0ccf0f601bac1ccfe"}
Feb 02 13:08:15 crc kubenswrapper[4721]: I0202 13:08:15.186126 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"014ac4f70cadb2e5ed3977a4b883172b9e9190b5cdf25295500702abdd38ede7"}
Feb 02 13:08:15 crc kubenswrapper[4721]: I0202 13:08:15.186172 4721 scope.go:117] "RemoveContainer" containerID="142fea12fc056e370e6c152f1358ffca7f5345b290b58801d6ec8ca042262591"
Feb 02 13:08:15 crc kubenswrapper[4721]: I0202 13:08:15.192244 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56c598d788-j9gjl"
Feb 02 13:08:15 crc kubenswrapper[4721]: I0202 13:08:15.255875 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-2dsnx"]
Feb 02 13:08:16 crc kubenswrapper[4721]: I0202 13:08:16.151425 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:08:25 crc kubenswrapper[4721]: I0202 13:08:25.140029 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:25 crc kubenswrapper[4721]: I0202 13:08:25.140574 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.313644 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-2dsnx" podUID="ae3f417e-2bae-44dd-973f-5314b6f64972" containerName="console" containerID="cri-o://f6b3ae8c770ea590746df91083a751d3ccb8e36d4619a57121eebb509e13924a" gracePeriod=15
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.666100 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-2dsnx_ae3f417e-2bae-44dd-973f-5314b6f64972/console/0.log"
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.666177 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2dsnx"
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.737420 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-trusted-ca-bundle\") pod \"ae3f417e-2bae-44dd-973f-5314b6f64972\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") "
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.737764 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9h96\" (UniqueName: \"kubernetes.io/projected/ae3f417e-2bae-44dd-973f-5314b6f64972-kube-api-access-w9h96\") pod \"ae3f417e-2bae-44dd-973f-5314b6f64972\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") "
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.737822 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-oauth-serving-cert\") pod \"ae3f417e-2bae-44dd-973f-5314b6f64972\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") "
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.737852 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-service-ca\") pod \"ae3f417e-2bae-44dd-973f-5314b6f64972\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") "
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.737922 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae3f417e-2bae-44dd-973f-5314b6f64972-console-serving-cert\") pod \"ae3f417e-2bae-44dd-973f-5314b6f64972\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") "
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.737943 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-console-config\") pod \"ae3f417e-2bae-44dd-973f-5314b6f64972\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") "
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.737959 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae3f417e-2bae-44dd-973f-5314b6f64972-console-oauth-config\") pod \"ae3f417e-2bae-44dd-973f-5314b6f64972\" (UID: \"ae3f417e-2bae-44dd-973f-5314b6f64972\") "
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.738940 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-console-config" (OuterVolumeSpecName: "console-config") pod "ae3f417e-2bae-44dd-973f-5314b6f64972" (UID: "ae3f417e-2bae-44dd-973f-5314b6f64972"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.738951 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ae3f417e-2bae-44dd-973f-5314b6f64972" (UID: "ae3f417e-2bae-44dd-973f-5314b6f64972"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.739130 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-service-ca" (OuterVolumeSpecName: "service-ca") pod "ae3f417e-2bae-44dd-973f-5314b6f64972" (UID: "ae3f417e-2bae-44dd-973f-5314b6f64972"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.739459 4721 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-console-config\") on node \"crc\" DevicePath \"\""
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.739535 4721 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.739599 4721 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-service-ca\") on node \"crc\" DevicePath \"\""
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.741900 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ae3f417e-2bae-44dd-973f-5314b6f64972" (UID: "ae3f417e-2bae-44dd-973f-5314b6f64972"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.743605 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3f417e-2bae-44dd-973f-5314b6f64972-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ae3f417e-2bae-44dd-973f-5314b6f64972" (UID: "ae3f417e-2bae-44dd-973f-5314b6f64972"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.744755 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3f417e-2bae-44dd-973f-5314b6f64972-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ae3f417e-2bae-44dd-973f-5314b6f64972" (UID: "ae3f417e-2bae-44dd-973f-5314b6f64972"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.745819 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae3f417e-2bae-44dd-973f-5314b6f64972-kube-api-access-w9h96" (OuterVolumeSpecName: "kube-api-access-w9h96") pod "ae3f417e-2bae-44dd-973f-5314b6f64972" (UID: "ae3f417e-2bae-44dd-973f-5314b6f64972"). InnerVolumeSpecName "kube-api-access-w9h96". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.841045 4721 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae3f417e-2bae-44dd-973f-5314b6f64972-console-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.841113 4721 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae3f417e-2bae-44dd-973f-5314b6f64972-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.841126 4721 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae3f417e-2bae-44dd-973f-5314b6f64972-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 13:08:40 crc kubenswrapper[4721]: I0202 13:08:40.841138 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9h96\" (UniqueName: \"kubernetes.io/projected/ae3f417e-2bae-44dd-973f-5314b6f64972-kube-api-access-w9h96\") on node \"crc\" DevicePath \"\""
Feb 02 13:08:41 crc kubenswrapper[4721]: I0202 13:08:41.348939 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-2dsnx_ae3f417e-2bae-44dd-973f-5314b6f64972/console/0.log"
Feb 02 13:08:41 crc kubenswrapper[4721]: I0202 13:08:41.349026 4721 generic.go:334] "Generic (PLEG): container finished" podID="ae3f417e-2bae-44dd-973f-5314b6f64972" containerID="f6b3ae8c770ea590746df91083a751d3ccb8e36d4619a57121eebb509e13924a" exitCode=2
Feb 02 13:08:41 crc kubenswrapper[4721]: I0202 13:08:41.349102 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2dsnx" event={"ID":"ae3f417e-2bae-44dd-973f-5314b6f64972","Type":"ContainerDied","Data":"f6b3ae8c770ea590746df91083a751d3ccb8e36d4619a57121eebb509e13924a"}
Feb 02 13:08:41 crc kubenswrapper[4721]: I0202 13:08:41.349148 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-2dsnx"
Feb 02 13:08:41 crc kubenswrapper[4721]: I0202 13:08:41.349172 4721 scope.go:117] "RemoveContainer" containerID="f6b3ae8c770ea590746df91083a751d3ccb8e36d4619a57121eebb509e13924a"
Feb 02 13:08:41 crc kubenswrapper[4721]: I0202 13:08:41.349152 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-2dsnx" event={"ID":"ae3f417e-2bae-44dd-973f-5314b6f64972","Type":"ContainerDied","Data":"03d46b30c74ef5c8430b448c7ad678889af4272d7acbd48aceae6628fd4f71b5"}
Feb 02 13:08:41 crc kubenswrapper[4721]: I0202 13:08:41.365639 4721 scope.go:117] "RemoveContainer" containerID="f6b3ae8c770ea590746df91083a751d3ccb8e36d4619a57121eebb509e13924a"
Feb 02 13:08:41 crc kubenswrapper[4721]: E0202 13:08:41.366207 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b3ae8c770ea590746df91083a751d3ccb8e36d4619a57121eebb509e13924a\": container with ID starting with f6b3ae8c770ea590746df91083a751d3ccb8e36d4619a57121eebb509e13924a not found: ID does not exist" containerID="f6b3ae8c770ea590746df91083a751d3ccb8e36d4619a57121eebb509e13924a"
Feb 02 13:08:41 crc kubenswrapper[4721]: I0202 13:08:41.366255 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b3ae8c770ea590746df91083a751d3ccb8e36d4619a57121eebb509e13924a"} err="failed to get container status \"f6b3ae8c770ea590746df91083a751d3ccb8e36d4619a57121eebb509e13924a\": rpc error: code = NotFound desc = could not find container \"f6b3ae8c770ea590746df91083a751d3ccb8e36d4619a57121eebb509e13924a\": container with ID starting with f6b3ae8c770ea590746df91083a751d3ccb8e36d4619a57121eebb509e13924a not found: ID does not exist"
Feb 02 13:08:41 crc kubenswrapper[4721]: I0202 13:08:41.389412 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-2dsnx"]
Feb 02 13:08:41 crc kubenswrapper[4721]: I0202 13:08:41.395114 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-2dsnx"]
Feb 02 13:08:42 crc kubenswrapper[4721]: I0202 13:08:42.418158 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae3f417e-2bae-44dd-973f-5314b6f64972" path="/var/lib/kubelet/pods/ae3f417e-2bae-44dd-973f-5314b6f64972/volumes"
Feb 02 13:08:45 crc kubenswrapper[4721]: I0202 13:08:45.145466 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:08:45 crc kubenswrapper[4721]: I0202 13:08:45.150385 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-67df565f78-vmr8d"
Feb 02 13:09:06 crc kubenswrapper[4721]: I0202 13:09:06.151597 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:09:06 crc kubenswrapper[4721]: I0202 13:09:06.182537 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:09:06 crc kubenswrapper[4721]: I0202 13:09:06.553667 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 13:09:45 crc kubenswrapper[4721]: I0202 13:09:45.896025 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-858d4f646b-v8xpv"]
Feb 02 13:09:45 crc kubenswrapper[4721]: E0202 13:09:45.896947
4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae3f417e-2bae-44dd-973f-5314b6f64972" containerName="console" Feb 02 13:09:45 crc kubenswrapper[4721]: I0202 13:09:45.896965 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae3f417e-2bae-44dd-973f-5314b6f64972" containerName="console" Feb 02 13:09:45 crc kubenswrapper[4721]: I0202 13:09:45.897093 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae3f417e-2bae-44dd-973f-5314b6f64972" containerName="console" Feb 02 13:09:45 crc kubenswrapper[4721]: I0202 13:09:45.897644 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:45 crc kubenswrapper[4721]: I0202 13:09:45.913478 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-858d4f646b-v8xpv"] Feb 02 13:09:45 crc kubenswrapper[4721]: I0202 13:09:45.962792 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-trusted-ca-bundle\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:45 crc kubenswrapper[4721]: I0202 13:09:45.962869 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-658px\" (UniqueName: \"kubernetes.io/projected/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-kube-api-access-658px\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:45 crc kubenswrapper[4721]: I0202 13:09:45.962906 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-oauth-config\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:45 crc kubenswrapper[4721]: I0202 13:09:45.962927 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-oauth-serving-cert\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:45 crc kubenswrapper[4721]: I0202 13:09:45.962954 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-serving-cert\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:45 crc kubenswrapper[4721]: I0202 13:09:45.963006 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-config\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:45 crc kubenswrapper[4721]: I0202 13:09:45.963029 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-service-ca\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.063750 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-config\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.063810 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-service-ca\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.063862 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-trusted-ca-bundle\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.063895 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-658px\" (UniqueName: \"kubernetes.io/projected/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-kube-api-access-658px\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.063920 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-oauth-config\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.063936 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-oauth-serving-cert\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.063954 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-serving-cert\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.064847 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-service-ca\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.065082 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-trusted-ca-bundle\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.065377 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-oauth-serving-cert\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.065473 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-config\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.070686 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-oauth-config\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.070834 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-serving-cert\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.079734 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-658px\" (UniqueName: \"kubernetes.io/projected/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-kube-api-access-658px\") pod \"console-858d4f646b-v8xpv\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.215117 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.422293 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-858d4f646b-v8xpv"] Feb 02 13:09:46 crc kubenswrapper[4721]: W0202 13:09:46.425968 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cb1d5e0_e67a_459b_ad6a_794d2f8bab70.slice/crio-550ae8a7d8f413e415a65a6ec4a23601971fd9d7d7d542219e95480b30d156d6 WatchSource:0}: Error finding container 550ae8a7d8f413e415a65a6ec4a23601971fd9d7d7d542219e95480b30d156d6: Status 404 returned error can't find the container with id 550ae8a7d8f413e415a65a6ec4a23601971fd9d7d7d542219e95480b30d156d6 Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.762165 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-858d4f646b-v8xpv" event={"ID":"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70","Type":"ContainerStarted","Data":"dd8792e47fe91e154df2c809101cbf81ddf00e86f1e989f7de7b429baf8fae66"} Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.762501 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-858d4f646b-v8xpv" event={"ID":"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70","Type":"ContainerStarted","Data":"550ae8a7d8f413e415a65a6ec4a23601971fd9d7d7d542219e95480b30d156d6"} Feb 02 13:09:46 crc kubenswrapper[4721]: I0202 13:09:46.783156 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-858d4f646b-v8xpv" podStartSLOduration=1.7827234 podStartE2EDuration="1.7827234s" podCreationTimestamp="2026-02-02 13:09:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:09:46.78049025 +0000 UTC m=+527.083004649" watchObservedRunningTime="2026-02-02 13:09:46.7827234 +0000 UTC m=+527.085237809" Feb 02 13:09:56 crc kubenswrapper[4721]: I0202 13:09:56.216300 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:56 crc kubenswrapper[4721]: I0202 13:09:56.217282 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:56 crc kubenswrapper[4721]: I0202 13:09:56.222126 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:56 crc kubenswrapper[4721]: I0202 13:09:56.823322 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:09:56 crc kubenswrapper[4721]: I0202 13:09:56.883212 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56c598d788-j9gjl"] Feb 02 13:10:21 crc kubenswrapper[4721]: I0202 13:10:21.924281 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-56c598d788-j9gjl" podUID="848aa930-7630-4eed-b114-23853a30daac" containerName="console" containerID="cri-o://98c0eac20465229d8bff6225571570b540d801c33450e2e81bc8f079fbc7cf32" gracePeriod=15 Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.330736 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56c598d788-j9gjl_848aa930-7630-4eed-b114-23853a30daac/console/0.log" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.331121 4721 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.398087 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-trusted-ca-bundle\") pod \"848aa930-7630-4eed-b114-23853a30daac\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.398139 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-oauth-serving-cert\") pod \"848aa930-7630-4eed-b114-23853a30daac\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.399134 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "848aa930-7630-4eed-b114-23853a30daac" (UID: "848aa930-7630-4eed-b114-23853a30daac"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.399144 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "848aa930-7630-4eed-b114-23853a30daac" (UID: "848aa930-7630-4eed-b114-23853a30daac"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.399148 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st584\" (UniqueName: \"kubernetes.io/projected/848aa930-7630-4eed-b114-23853a30daac-kube-api-access-st584\") pod \"848aa930-7630-4eed-b114-23853a30daac\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.399256 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-service-ca\") pod \"848aa930-7630-4eed-b114-23853a30daac\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.399289 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/848aa930-7630-4eed-b114-23853a30daac-console-oauth-config\") pod \"848aa930-7630-4eed-b114-23853a30daac\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.399316 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/848aa930-7630-4eed-b114-23853a30daac-console-serving-cert\") pod \"848aa930-7630-4eed-b114-23853a30daac\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.399344 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-console-config\") pod \"848aa930-7630-4eed-b114-23853a30daac\" (UID: \"848aa930-7630-4eed-b114-23853a30daac\") " Feb 02 13:10:22 crc 
kubenswrapper[4721]: I0202 13:10:22.399831 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-service-ca" (OuterVolumeSpecName: "service-ca") pod "848aa930-7630-4eed-b114-23853a30daac" (UID: "848aa930-7630-4eed-b114-23853a30daac"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.400103 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-console-config" (OuterVolumeSpecName: "console-config") pod "848aa930-7630-4eed-b114-23853a30daac" (UID: "848aa930-7630-4eed-b114-23853a30daac"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.400221 4721 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.400244 4721 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.400257 4721 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.404410 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848aa930-7630-4eed-b114-23853a30daac-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "848aa930-7630-4eed-b114-23853a30daac" (UID: "848aa930-7630-4eed-b114-23853a30daac"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.404615 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/848aa930-7630-4eed-b114-23853a30daac-kube-api-access-st584" (OuterVolumeSpecName: "kube-api-access-st584") pod "848aa930-7630-4eed-b114-23853a30daac" (UID: "848aa930-7630-4eed-b114-23853a30daac"). InnerVolumeSpecName "kube-api-access-st584". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.404675 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848aa930-7630-4eed-b114-23853a30daac-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "848aa930-7630-4eed-b114-23853a30daac" (UID: "848aa930-7630-4eed-b114-23853a30daac"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.501628 4721 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/848aa930-7630-4eed-b114-23853a30daac-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.501669 4721 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/848aa930-7630-4eed-b114-23853a30daac-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.501679 4721 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/848aa930-7630-4eed-b114-23853a30daac-console-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.501691 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st584\" (UniqueName: \"kubernetes.io/projected/848aa930-7630-4eed-b114-23853a30daac-kube-api-access-st584\") on node \"crc\" DevicePath \"\"" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.967416 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56c598d788-j9gjl_848aa930-7630-4eed-b114-23853a30daac/console/0.log" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.967712 4721 generic.go:334] "Generic (PLEG): container finished" podID="848aa930-7630-4eed-b114-23853a30daac" containerID="98c0eac20465229d8bff6225571570b540d801c33450e2e81bc8f079fbc7cf32" exitCode=2 Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.967747 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56c598d788-j9gjl" event={"ID":"848aa930-7630-4eed-b114-23853a30daac","Type":"ContainerDied","Data":"98c0eac20465229d8bff6225571570b540d801c33450e2e81bc8f079fbc7cf32"} Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.967781 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56c598d788-j9gjl" event={"ID":"848aa930-7630-4eed-b114-23853a30daac","Type":"ContainerDied","Data":"fcb6049175d618686bd24850fd66d3afd1a92f853e4d4ca21439dea1da41c941"} Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.967807 4721 scope.go:117] "RemoveContainer" containerID="98c0eac20465229d8bff6225571570b540d801c33450e2e81bc8f079fbc7cf32" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.967817 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56c598d788-j9gjl" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.986549 4721 scope.go:117] "RemoveContainer" containerID="98c0eac20465229d8bff6225571570b540d801c33450e2e81bc8f079fbc7cf32" Feb 02 13:10:22 crc kubenswrapper[4721]: E0202 13:10:22.987044 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98c0eac20465229d8bff6225571570b540d801c33450e2e81bc8f079fbc7cf32\": container with ID starting with 98c0eac20465229d8bff6225571570b540d801c33450e2e81bc8f079fbc7cf32 not found: ID does not exist" containerID="98c0eac20465229d8bff6225571570b540d801c33450e2e81bc8f079fbc7cf32" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.987105 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c0eac20465229d8bff6225571570b540d801c33450e2e81bc8f079fbc7cf32"} err="failed to get container status \"98c0eac20465229d8bff6225571570b540d801c33450e2e81bc8f079fbc7cf32\": rpc error: code = NotFound desc = could not find container \"98c0eac20465229d8bff6225571570b540d801c33450e2e81bc8f079fbc7cf32\": container with ID starting with 98c0eac20465229d8bff6225571570b540d801c33450e2e81bc8f079fbc7cf32 not found: ID does not exist" Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.989954 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56c598d788-j9gjl"] Feb 02 13:10:22 crc kubenswrapper[4721]: I0202 13:10:22.992201 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56c598d788-j9gjl"] Feb 02 13:10:24 crc kubenswrapper[4721]: I0202 13:10:24.417173 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="848aa930-7630-4eed-b114-23853a30daac" path="/var/lib/kubelet/pods/848aa930-7630-4eed-b114-23853a30daac/volumes" Feb 02 13:10:44 crc kubenswrapper[4721]: I0202 13:10:44.763931 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:10:44 crc kubenswrapper[4721]: I0202 13:10:44.764404 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:11:14 crc kubenswrapper[4721]: I0202 13:11:14.764319 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:11:14 crc kubenswrapper[4721]: I0202 13:11:14.765038 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:11:44 crc kubenswrapper[4721]: I0202 13:11:44.764058 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:11:44 crc kubenswrapper[4721]: I0202 13:11:44.764701 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:11:44 crc kubenswrapper[4721]: I0202 13:11:44.764760 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:11:44 crc kubenswrapper[4721]: I0202 13:11:44.765503 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"014ac4f70cadb2e5ed3977a4b883172b9e9190b5cdf25295500702abdd38ede7"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 13:11:44 crc kubenswrapper[4721]: I0202 13:11:44.765563 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://014ac4f70cadb2e5ed3977a4b883172b9e9190b5cdf25295500702abdd38ede7" gracePeriod=600 Feb 02 13:11:45 crc kubenswrapper[4721]: I0202 13:11:45.477588 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"014ac4f70cadb2e5ed3977a4b883172b9e9190b5cdf25295500702abdd38ede7"} Feb 02 13:11:45 crc kubenswrapper[4721]: I0202 13:11:45.478200 4721 scope.go:117] "RemoveContainer" containerID="b87a5ff0b46a353772eefb47418659c5151aa34831e5ecd0ccf0f601bac1ccfe" Feb 02 13:11:45 crc kubenswrapper[4721]: I0202 13:11:45.477540 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="014ac4f70cadb2e5ed3977a4b883172b9e9190b5cdf25295500702abdd38ede7" exitCode=0 Feb 02 13:11:45 crc kubenswrapper[4721]: I0202 13:11:45.478333 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"4e271bf7e19d8205d47335a427c173d1e8d60e0f2a6167b224679306973cc1cc"} Feb 02 13:12:08 crc kubenswrapper[4721]: I0202 13:12:08.859794 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj"] Feb 02 13:12:08 crc kubenswrapper[4721]: E0202 13:12:08.860412 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848aa930-7630-4eed-b114-23853a30daac" containerName="console" Feb 02 13:12:08 crc kubenswrapper[4721]: I0202 13:12:08.860423 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="848aa930-7630-4eed-b114-23853a30daac" containerName="console" Feb 02 13:12:08 crc kubenswrapper[4721]: I0202 13:12:08.860525 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="848aa930-7630-4eed-b114-23853a30daac" containerName="console" Feb 02 13:12:08 crc kubenswrapper[4721]: 
I0202 13:12:08.861322 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" Feb 02 13:12:08 crc kubenswrapper[4721]: I0202 13:12:08.862849 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 02 13:12:08 crc kubenswrapper[4721]: I0202 13:12:08.872361 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj"] Feb 02 13:12:09 crc kubenswrapper[4721]: I0202 13:12:09.016981 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj\" (UID: \"6b6f1f89-2c62-4c26-abd3-2d105289fc8c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" Feb 02 13:12:09 crc kubenswrapper[4721]: I0202 13:12:09.017035 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdgvg\" (UniqueName: \"kubernetes.io/projected/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-kube-api-access-cdgvg\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj\" (UID: \"6b6f1f89-2c62-4c26-abd3-2d105289fc8c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" Feb 02 13:12:09 crc kubenswrapper[4721]: I0202 13:12:09.017153 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj\" (UID: \"6b6f1f89-2c62-4c26-abd3-2d105289fc8c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" Feb 02 13:12:09 crc kubenswrapper[4721]: I0202 13:12:09.118386 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj\" (UID: \"6b6f1f89-2c62-4c26-abd3-2d105289fc8c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" Feb 02 13:12:09 crc kubenswrapper[4721]: I0202 13:12:09.118538 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj\" (UID: \"6b6f1f89-2c62-4c26-abd3-2d105289fc8c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" Feb 02 13:12:09 crc kubenswrapper[4721]: I0202 13:12:09.118596 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdgvg\" (UniqueName: \"kubernetes.io/projected/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-kube-api-access-cdgvg\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj\" (UID: \"6b6f1f89-2c62-4c26-abd3-2d105289fc8c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" Feb 02 13:12:09 crc kubenswrapper[4721]: I0202 13:12:09.118990 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj\" (UID: \"6b6f1f89-2c62-4c26-abd3-2d105289fc8c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" Feb 02 13:12:09 crc kubenswrapper[4721]: I0202 13:12:09.118990 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj\" (UID: \"6b6f1f89-2c62-4c26-abd3-2d105289fc8c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" Feb 02 13:12:09 crc kubenswrapper[4721]: I0202 13:12:09.141360 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdgvg\" (UniqueName: \"kubernetes.io/projected/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-kube-api-access-cdgvg\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj\" (UID: \"6b6f1f89-2c62-4c26-abd3-2d105289fc8c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" Feb 02 13:12:09 crc kubenswrapper[4721]: I0202 13:12:09.181956 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" Feb 02 13:12:09 crc kubenswrapper[4721]: I0202 13:12:09.588355 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj"] Feb 02 13:12:09 crc kubenswrapper[4721]: I0202 13:12:09.647712 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" event={"ID":"6b6f1f89-2c62-4c26-abd3-2d105289fc8c","Type":"ContainerStarted","Data":"f4cef8bac972b217cde6a5c6238cad0bafecad68a72372f37b1a4bb8d224d7aa"} Feb 02 13:12:10 crc kubenswrapper[4721]: I0202 13:12:10.654829 4721 generic.go:334] "Generic (PLEG): container finished" podID="6b6f1f89-2c62-4c26-abd3-2d105289fc8c" containerID="b55b1e33f819c0c2f22ef5379434b120bd261d0b70b6b470a047653e44c7773e" exitCode=0 Feb 02 13:12:10 crc kubenswrapper[4721]: I0202 13:12:10.654947 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" event={"ID":"6b6f1f89-2c62-4c26-abd3-2d105289fc8c","Type":"ContainerDied","Data":"b55b1e33f819c0c2f22ef5379434b120bd261d0b70b6b470a047653e44c7773e"} Feb 02 13:12:10 crc kubenswrapper[4721]: I0202 13:12:10.658144 4721 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 13:12:12 crc kubenswrapper[4721]: I0202 13:12:12.668761 4721 generic.go:334] "Generic (PLEG): container finished" podID="6b6f1f89-2c62-4c26-abd3-2d105289fc8c" containerID="fa5963647cb5da5e7b22f8c4fc93bba86a1ea910c234da502ff074c7aae60cc6" exitCode=0 Feb 02 13:12:12 crc kubenswrapper[4721]: I0202 13:12:12.668803 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" event={"ID":"6b6f1f89-2c62-4c26-abd3-2d105289fc8c","Type":"ContainerDied","Data":"fa5963647cb5da5e7b22f8c4fc93bba86a1ea910c234da502ff074c7aae60cc6"} Feb 02 13:12:13 crc kubenswrapper[4721]: I0202 13:12:13.676674 4721 generic.go:334] "Generic (PLEG): container 
finished" podID="6b6f1f89-2c62-4c26-abd3-2d105289fc8c" containerID="14a0661444512e8d743ac85d013769c639534bc5f2b5c95c34e6575b49d7b455" exitCode=0 Feb 02 13:12:13 crc kubenswrapper[4721]: I0202 13:12:13.676738 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" event={"ID":"6b6f1f89-2c62-4c26-abd3-2d105289fc8c","Type":"ContainerDied","Data":"14a0661444512e8d743ac85d013769c639534bc5f2b5c95c34e6575b49d7b455"} Feb 02 13:12:14 crc kubenswrapper[4721]: I0202 13:12:14.924655 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" Feb 02 13:12:15 crc kubenswrapper[4721]: I0202 13:12:15.017955 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-bundle\") pod \"6b6f1f89-2c62-4c26-abd3-2d105289fc8c\" (UID: \"6b6f1f89-2c62-4c26-abd3-2d105289fc8c\") " Feb 02 13:12:15 crc kubenswrapper[4721]: I0202 13:12:15.018057 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdgvg\" (UniqueName: \"kubernetes.io/projected/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-kube-api-access-cdgvg\") pod \"6b6f1f89-2c62-4c26-abd3-2d105289fc8c\" (UID: \"6b6f1f89-2c62-4c26-abd3-2d105289fc8c\") " Feb 02 13:12:15 crc kubenswrapper[4721]: I0202 13:12:15.018215 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-util\") pod \"6b6f1f89-2c62-4c26-abd3-2d105289fc8c\" (UID: \"6b6f1f89-2c62-4c26-abd3-2d105289fc8c\") " Feb 02 13:12:15 crc kubenswrapper[4721]: I0202 13:12:15.019910 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-bundle" (OuterVolumeSpecName: "bundle") pod "6b6f1f89-2c62-4c26-abd3-2d105289fc8c" (UID: "6b6f1f89-2c62-4c26-abd3-2d105289fc8c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:12:15 crc kubenswrapper[4721]: I0202 13:12:15.026753 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-kube-api-access-cdgvg" (OuterVolumeSpecName: "kube-api-access-cdgvg") pod "6b6f1f89-2c62-4c26-abd3-2d105289fc8c" (UID: "6b6f1f89-2c62-4c26-abd3-2d105289fc8c"). InnerVolumeSpecName "kube-api-access-cdgvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:12:15 crc kubenswrapper[4721]: I0202 13:12:15.119285 4721 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:12:15 crc kubenswrapper[4721]: I0202 13:12:15.119334 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdgvg\" (UniqueName: \"kubernetes.io/projected/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-kube-api-access-cdgvg\") on node \"crc\" DevicePath \"\"" Feb 02 13:12:15 crc kubenswrapper[4721]: I0202 13:12:15.324711 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-util" (OuterVolumeSpecName: "util") pod "6b6f1f89-2c62-4c26-abd3-2d105289fc8c" (UID: "6b6f1f89-2c62-4c26-abd3-2d105289fc8c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:12:15 crc kubenswrapper[4721]: I0202 13:12:15.425331 4721 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b6f1f89-2c62-4c26-abd3-2d105289fc8c-util\") on node \"crc\" DevicePath \"\"" Feb 02 13:12:15 crc kubenswrapper[4721]: I0202 13:12:15.694498 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" event={"ID":"6b6f1f89-2c62-4c26-abd3-2d105289fc8c","Type":"ContainerDied","Data":"f4cef8bac972b217cde6a5c6238cad0bafecad68a72372f37b1a4bb8d224d7aa"} Feb 02 13:12:15 crc kubenswrapper[4721]: I0202 13:12:15.694542 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4cef8bac972b217cde6a5c6238cad0bafecad68a72372f37b1a4bb8d224d7aa" Feb 02 13:12:15 crc kubenswrapper[4721]: I0202 13:12:15.694557 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj" Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.054451 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pwcs2"] Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.055473 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovn-controller" containerID="cri-o://30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8" gracePeriod=30 Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.055837 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="sbdb" containerID="cri-o://27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a" gracePeriod=30 Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.055880 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="nbdb" containerID="cri-o://3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5" gracePeriod=30 Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.055912 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="northd" containerID="cri-o://677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34" gracePeriod=30 Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.055940 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3" gracePeriod=30 Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.055994 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="kube-rbac-proxy-node" containerID="cri-o://dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6" gracePeriod=30 Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.056027 4721 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovn-acl-logging" containerID="cri-o://4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301" gracePeriod=30 Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.098972 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller" containerID="cri-o://3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464" gracePeriod=30 Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.739242 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovnkube-controller/3.log" Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.742005 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovn-acl-logging/0.log" Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.742653 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovn-controller/0.log" Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.744303 4721 generic.go:334] "Generic (PLEG): container finished" podID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerID="3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464" exitCode=0 Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.744366 4721 generic.go:334] "Generic (PLEG): container finished" podID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerID="27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a" exitCode=0 Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.744377 4721 generic.go:334] "Generic (PLEG): container finished" podID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerID="3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5" exitCode=0 Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.744386 4721 generic.go:334] "Generic (PLEG): container finished" podID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerID="677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34" exitCode=0 Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.744396 4721 generic.go:334] "Generic (PLEG): container finished" podID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerID="4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301" exitCode=143 Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.744407 4721 generic.go:334] "Generic (PLEG): container finished" podID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerID="30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8" exitCode=143 Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.744385 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerDied","Data":"3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464"} Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.744511 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerDied","Data":"27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a"} Feb 02 
13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.744532 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerDied","Data":"3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5"}
Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.744550 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerDied","Data":"677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34"}
Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.744564 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerDied","Data":"4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301"}
Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.744575 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerDied","Data":"30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8"}
Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.744596 4721 scope.go:117] "RemoveContainer" containerID="0ef87877d327f3682656924644afe911470b558fe3fd45dd708e4d6f0aa69f29"
Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.747889 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltw7d_5ba84858-caaa-4fba-8eaf-9f7ddece0b3a/kube-multus/2.log"
Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.749385 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltw7d_5ba84858-caaa-4fba-8eaf-9f7ddece0b3a/kube-multus/1.log"
Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.749426 4721 generic.go:334] "Generic (PLEG): container finished" podID="5ba84858-caaa-4fba-8eaf-9f7ddece0b3a" containerID="c98892a7ff179bcba871f45746b3c85a83090c01da93d13aeaeb2a282472689d" exitCode=2
Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.749452 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltw7d" event={"ID":"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a","Type":"ContainerDied","Data":"c98892a7ff179bcba871f45746b3c85a83090c01da93d13aeaeb2a282472689d"}
Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.749928 4721 scope.go:117] "RemoveContainer" containerID="c98892a7ff179bcba871f45746b3c85a83090c01da93d13aeaeb2a282472689d"
Feb 02 13:12:20 crc kubenswrapper[4721]: E0202 13:12:20.750121 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-ltw7d_openshift-multus(5ba84858-caaa-4fba-8eaf-9f7ddece0b3a)\"" pod="openshift-multus/multus-ltw7d" podUID="5ba84858-caaa-4fba-8eaf-9f7ddece0b3a"
Feb 02 13:12:20 crc kubenswrapper[4721]: I0202 13:12:20.770512 4721 scope.go:117] "RemoveContainer" containerID="3e01f486c1be69ceb8e869e6feefbef4e07fd230f5cb41ec3adfaaba36430569"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.234873 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovn-acl-logging/0.log"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.235333 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovn-controller/0.log"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.235643 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328562 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-run-netns\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328616 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-var-lib-openvswitch\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328656 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-var-lib-cni-networks-ovn-kubernetes\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328696 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88f6w\" (UniqueName: \"kubernetes.io/projected/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-kube-api-access-88f6w\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328721 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-log-socket\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328746 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovn-node-metrics-cert\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328765 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-slash\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328791 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-openvswitch\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328768 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328830 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-env-overrides\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328869 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-ovn\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328872 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-log-socket" (OuterVolumeSpecName: "log-socket") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328897 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-kubelet\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328909 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328937 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovnkube-script-lib\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328944 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328976 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-cni-bin\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.328997 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-run-ovn-kubernetes\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.329039 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-systemd-units\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.329058 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-systemd\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.329107 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovnkube-config\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.329128 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-node-log\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.329156 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-etc-openvswitch\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.329193 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-cni-netd\") pod \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\" (UID: \"b15bc48d-f88d-4b38-a9e1-00bb00b88a52\") "
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.329489 4721 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-run-netns\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.329504 4721 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.329515 4721 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.329530 4721 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-log-socket\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.329603 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.330193 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.330265 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-slash" (OuterVolumeSpecName: "host-slash") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.330288 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.330330 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.330364 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.330575 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-node-log" (OuterVolumeSpecName: "node-log") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.330657 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.330982 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.331008 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.331021 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.331045 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.331054 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.341747 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-kube-api-access-88f6w" (OuterVolumeSpecName: "kube-api-access-88f6w") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "kube-api-access-88f6w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.373432 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.394358 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8dtnt"]
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.394670 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="northd"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.394688 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="northd"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.394700 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovn-acl-logging"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.394708 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovn-acl-logging"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.394722 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="kube-rbac-proxy-ovn-metrics"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.394730 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="kube-rbac-proxy-ovn-metrics"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.394741 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.394750 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.394762 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovn-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.394770 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovn-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.394782 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="kube-rbac-proxy-node"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.394789 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="kube-rbac-proxy-node"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.394803 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="kubecfg-setup"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.394811 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="kubecfg-setup"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.394822 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6f1f89-2c62-4c26-abd3-2d105289fc8c" containerName="extract"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.394831 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6f1f89-2c62-4c26-abd3-2d105289fc8c" containerName="extract"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.394843 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.394849 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.394859 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6f1f89-2c62-4c26-abd3-2d105289fc8c" containerName="pull"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.394868 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6f1f89-2c62-4c26-abd3-2d105289fc8c" containerName="pull"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.394877 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.394884 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.394892 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="nbdb"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.394899 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="nbdb"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.394909 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6f1f89-2c62-4c26-abd3-2d105289fc8c" containerName="util"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.394915 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6f1f89-2c62-4c26-abd3-2d105289fc8c" containerName="util"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.394926 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="sbdb"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.394933 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="sbdb"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.395055 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovn-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.395070 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="kube-rbac-proxy-ovn-metrics"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.395094 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b6f1f89-2c62-4c26-abd3-2d105289fc8c" containerName="extract"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.395107 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.395116 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovn-acl-logging"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.395123 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="northd"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.395131 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="nbdb"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.395150 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="sbdb"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.395163 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="kube-rbac-proxy-node"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.395173 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.395181 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.395190 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.395202 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.395328 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.395337 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.395604 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.395615 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerName="ovnkube-controller"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.396354 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "b15bc48d-f88d-4b38-a9e1-00bb00b88a52" (UID: "b15bc48d-f88d-4b38-a9e1-00bb00b88a52"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.397646 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.432246 4721 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-cni-netd\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.432792 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88f6w\" (UniqueName: \"kubernetes.io/projected/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-kube-api-access-88f6w\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.432859 4721 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.432930 4721 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-slash\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.432996 4721 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.433063 4721 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.433145 4721 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.433207 4721 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-kubelet\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.433260 4721 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.433309 4721 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.433356 4721 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-host-cni-bin\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.433403 4721 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-systemd-units\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.433452 4721 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-run-systemd\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.433506 4721 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.433553 4721 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-node-log\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.433598 4721 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b15bc48d-f88d-4b38-a9e1-00bb00b88a52-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.534842 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vntkf\" (UniqueName: \"kubernetes.io/projected/1733f0ca-4783-4099-b66d-b497993def10-kube-api-access-vntkf\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.534899 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-cni-bin\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.534956 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-var-lib-openvswitch\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.534979 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-kubelet\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535004 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-run-ovn-kubernetes\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535025 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-slash\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535047 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-run-systemd\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535111 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-run-openvswitch\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535156 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-cni-netd\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535184 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1733f0ca-4783-4099-b66d-b497993def10-ovnkube-config\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535211 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-run-netns\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535238 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-etc-openvswitch\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535257 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-log-socket\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535285 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1733f0ca-4783-4099-b66d-b497993def10-ovnkube-script-lib\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535306 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-systemd-units\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535331 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-node-log\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535367 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1733f0ca-4783-4099-b66d-b497993def10-env-overrides\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535403 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535428 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1733f0ca-4783-4099-b66d-b497993def10-ovn-node-metrics-cert\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.535452 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-run-ovn\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.636929 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-slash\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.637238 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-run-systemd\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.637353 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-run-openvswitch\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.637418 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-run-openvswitch\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.637128 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-slash\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.637282 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-run-systemd\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.637585 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-cni-netd\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.637592 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-cni-netd\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.637746 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1733f0ca-4783-4099-b66d-b497993def10-ovnkube-config\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.637854 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-run-netns\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.637935 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-etc-openvswitch\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.638018 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-log-socket\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.638080 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-etc-openvswitch\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.638057 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-log-socket\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.638005 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-run-netns\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.638267 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1733f0ca-4783-4099-b66d-b497993def10-ovnkube-script-lib\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.638399 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-systemd-units\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.638495 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-node-log\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.638558 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-node-log\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.638452 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-systemd-units\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.638659 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1733f0ca-4783-4099-b66d-b497993def10-ovnkube-config\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.638744 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1733f0ca-4783-4099-b66d-b497993def10-env-overrides\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.638855 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.639152 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1733f0ca-4783-4099-b66d-b497993def10-ovn-node-metrics-cert\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.639264 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-run-ovn\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.639382 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vntkf\" (UniqueName: \"kubernetes.io/projected/1733f0ca-4783-4099-b66d-b497993def10-kube-api-access-vntkf\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.639479 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-cni-bin\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.639568 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-cni-bin\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.639322 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-run-ovn\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.638947 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.639297 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1733f0ca-4783-4099-b66d-b497993def10-env-overrides\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.639097 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1733f0ca-4783-4099-b66d-b497993def10-ovnkube-script-lib\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.639814 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-var-lib-openvswitch\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.639910 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-kubelet\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.639978 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-kubelet\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.639861 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-var-lib-openvswitch\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.640190 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-run-ovn-kubernetes\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.640298 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1733f0ca-4783-4099-b66d-b497993def10-host-run-ovn-kubernetes\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.642589 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1733f0ca-4783-4099-b66d-b497993def10-ovn-node-metrics-cert\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.659336 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vntkf\" (UniqueName: \"kubernetes.io/projected/1733f0ca-4783-4099-b66d-b497993def10-kube-api-access-vntkf\") pod \"ovnkube-node-8dtnt\" (UID: \"1733f0ca-4783-4099-b66d-b497993def10\") " pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.722012 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.755580 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" event={"ID":"1733f0ca-4783-4099-b66d-b497993def10","Type":"ContainerStarted","Data":"916fed5acbc7fa0d10fdd8ecb098433cb96ff7234cffebd73f037c512862f82a"}
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.760154 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovn-acl-logging/0.log"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.760865 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pwcs2_b15bc48d-f88d-4b38-a9e1-00bb00b88a52/ovn-controller/0.log"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.761449 4721 generic.go:334] "Generic (PLEG): container finished" podID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerID="51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3" exitCode=0
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.761482 4721 generic.go:334] "Generic (PLEG): container finished" podID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" containerID="dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6" exitCode=0
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.761499 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerDied","Data":"51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3"}
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.761544 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerDied","Data":"dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6"}
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.761557 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2" event={"ID":"b15bc48d-f88d-4b38-a9e1-00bb00b88a52","Type":"ContainerDied","Data":"1aec1a12b2d4ba708a47d40a6a1a8e146d69e8b9d3bc97bc1257a9e8fc573862"}
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.761578 4721 scope.go:117] "RemoveContainer" containerID="3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.761632 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pwcs2"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.763617 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltw7d_5ba84858-caaa-4fba-8eaf-9f7ddece0b3a/kube-multus/2.log"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.779115 4721 scope.go:117] "RemoveContainer" containerID="27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.801731 4721 scope.go:117] "RemoveContainer" containerID="3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.819283 4721 scope.go:117] "RemoveContainer" containerID="677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.819902 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pwcs2"]
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.824302 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pwcs2"]
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.842737 4721 scope.go:117] "RemoveContainer" containerID="51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.856052 4721 scope.go:117] "RemoveContainer" containerID="dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.870209 4721 scope.go:117] "RemoveContainer" containerID="4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.885351 4721 scope.go:117] "RemoveContainer" containerID="30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.912947 4721 scope.go:117] "RemoveContainer" containerID="406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.937486 4721 scope.go:117] "RemoveContainer" containerID="3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.938072 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464\": container with ID starting with 3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464 not found: ID does not exist" containerID="3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.938114 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464"} err="failed to get container status \"3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464\": rpc error: code = NotFound desc = could not find container \"3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464\": container with ID starting with 3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464 not found: ID does not exist"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.938145 4721 scope.go:117] "RemoveContainer" containerID="27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.940181 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\": container with ID starting with 27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a not found: ID does not exist" containerID="27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.940214 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a"} err="failed to get container status \"27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\": rpc error: code = NotFound desc = could not find container \"27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\": container with ID starting with 27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a not found: ID does not exist"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.940246 4721 scope.go:117] "RemoveContainer" containerID="3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.940630 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\": container with ID starting with 3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5 not found: ID does not exist" containerID="3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.940657 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5"} err="failed to get container status \"3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\": rpc error: code = NotFound desc = could not find container \"3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\": container with ID starting with 3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5 not found: ID does not exist"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.940684 4721 scope.go:117] "RemoveContainer" containerID="677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.941144 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\": container with ID starting with 677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34 not found: ID does not exist" containerID="677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.941179 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34"} err="failed to get container status \"677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\": rpc error: code = NotFound desc = could not find container \"677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\": container with ID starting with 677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34 not found: ID does not exist"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.941202 4721 scope.go:117] "RemoveContainer" containerID="51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.941491 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\": container with ID starting with 51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3 not found: ID does not exist" containerID="51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.941522 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3"} err="failed to get container status \"51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\": rpc error: code = NotFound desc = could not find container \"51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\": container with ID starting with 51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3 not found: ID does not exist"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.941544 4721 scope.go:117] "RemoveContainer" containerID="dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.941779 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\": container with ID starting with dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6 not found: ID does not exist" containerID="dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.941805 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6"} err="failed to get container status \"dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\": rpc error: code = NotFound desc = could not find container \"dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\": container with ID starting with dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6 not found: ID does not exist"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.941824 4721 scope.go:117] "RemoveContainer" containerID="4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301"
Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.942058 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\": container with ID starting with 4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301 not found: ID does not exist" containerID="4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301"
Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.942096 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301"} err="failed to get container status \"4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\": rpc error: code = NotFound desc = could not find container \"4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\": container with ID starting with
4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301 not found: ID does not exist" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.942111 4721 scope.go:117] "RemoveContainer" containerID="30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8" Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.942311 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\": container with ID starting with 30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8 not found: ID does not exist" containerID="30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.942343 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8"} err="failed to get container status \"30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\": rpc error: code = NotFound desc = could not find container \"30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\": container with ID starting with 30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8 not found: ID does not exist" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.942361 4721 scope.go:117] "RemoveContainer" containerID="406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b" Feb 02 13:12:21 crc kubenswrapper[4721]: E0202 13:12:21.942612 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\": container with ID starting with 406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b not found: ID does not exist" containerID="406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.942637 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b"} err="failed to get container status \"406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\": rpc error: code = NotFound desc = could not find container \"406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\": container with ID starting with 406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b not found: ID does not exist" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.942654 4721 scope.go:117] "RemoveContainer" containerID="3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.942826 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464"} err="failed to get container status \"3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464\": rpc error: code = NotFound desc = could not find container \"3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464\": container with ID starting with 3d1bf13cf098a5769bf003590f59136941db8ae4e0650114deb6dfec3a6e6464 not found: ID does not exist" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.942849 4721 scope.go:117] "RemoveContainer" containerID="27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a" Feb 02 13:12:21 crc 
kubenswrapper[4721]: I0202 13:12:21.943025 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a"} err="failed to get container status \"27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\": rpc error: code = NotFound desc = could not find container \"27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a\": container with ID starting with 27e989df998ab4c64e1ddde284d04945c0183fac9a5f452ce859af9548ae292a not found: ID does not exist" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.943045 4721 scope.go:117] "RemoveContainer" containerID="3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.943382 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5"} err="failed to get container status \"3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\": rpc error: code = NotFound desc = could not find container \"3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5\": container with ID starting with 3f9377b4f5108a5e0273594062e1c210fc2dbf2e2a27ca130fc2c669d91007b5 not found: ID does not exist" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.943404 4721 scope.go:117] "RemoveContainer" containerID="677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.943743 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34"} err="failed to get container status \"677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\": rpc error: code = NotFound desc = could not find container \"677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34\": container with ID starting with 677e97a9c5184f19a782e3fb3881ac573f2cfe36d5965cfb7d85937a0a49ac34 not found: ID does not exist" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.943772 4721 scope.go:117] "RemoveContainer" containerID="51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.944044 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3"} err="failed to get container status \"51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\": rpc error: code = NotFound desc = could not find container \"51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3\": container with ID starting with 51ef6bdd231cafcaecaee4d6c7a6ca1c4b54947522020d49e19421c63699eac3 not found: ID does not exist" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.944094 4721 scope.go:117] "RemoveContainer" containerID="dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.944409 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6"} err="failed to get container status \"dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\": rpc error: code = NotFound desc = could not find container \"dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6\": container with ID 
starting with dc83b18fdeff15b470cdfd05ed235d5e1a7b10d3e8a59e8d9ece43c0a4f3f6e6 not found: ID does not exist" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.944428 4721 scope.go:117] "RemoveContainer" containerID="4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.944631 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301"} err="failed to get container status \"4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\": rpc error: code = NotFound desc = could not find container \"4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301\": container with ID starting with 4d5822b1d1a041866f584d6ec48869a2aed81ed3bcc7cd6b274875c81b71f301 not found: ID does not exist" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.944657 4721 scope.go:117] "RemoveContainer" containerID="30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.944890 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8"} err="failed to get container status \"30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\": rpc error: code = NotFound desc = could not find container \"30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8\": container with ID starting with 30c12d847ad570843fcdf211e47b860d378600023820faa9095937f4ffe713e8 not found: ID does not exist" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.944916 4721 scope.go:117] "RemoveContainer" containerID="406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b" Feb 02 13:12:21 crc kubenswrapper[4721]: I0202 13:12:21.945153 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b"} err="failed to get container status \"406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\": rpc error: code = NotFound desc = could not find container \"406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b\": container with ID starting with 406d5bcd32d4f27950ac751d73dccb89a8807311e8ec53595606b4c10a6dc67b not found: ID does not exist" Feb 02 13:12:22 crc kubenswrapper[4721]: I0202 13:12:22.417435 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b15bc48d-f88d-4b38-a9e1-00bb00b88a52" path="/var/lib/kubelet/pods/b15bc48d-f88d-4b38-a9e1-00bb00b88a52/volumes" Feb 02 13:12:22 crc kubenswrapper[4721]: I0202 13:12:22.773544 4721 generic.go:334] "Generic (PLEG): container finished" podID="1733f0ca-4783-4099-b66d-b497993def10" containerID="d5be3aa663efe1316c680faa622ca237b5b86f03f042993c0996d47a7590e74d" exitCode=0 Feb 02 13:12:22 crc kubenswrapper[4721]: I0202 13:12:22.773610 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" event={"ID":"1733f0ca-4783-4099-b66d-b497993def10","Type":"ContainerDied","Data":"d5be3aa663efe1316c680faa622ca237b5b86f03f042993c0996d47a7590e74d"} Feb 02 13:12:23 crc kubenswrapper[4721]: I0202 13:12:23.782302 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" 
event={"ID":"1733f0ca-4783-4099-b66d-b497993def10","Type":"ContainerStarted","Data":"6d3bdaa9a13a718f47d3ec08be258880a297485f5f530643cd81fb43bde5080c"} Feb 02 13:12:23 crc kubenswrapper[4721]: I0202 13:12:23.782639 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" event={"ID":"1733f0ca-4783-4099-b66d-b497993def10","Type":"ContainerStarted","Data":"ce0c4d525718d4161300ff87f422840d6d7466f6fd67edc8119125210e7679db"} Feb 02 13:12:23 crc kubenswrapper[4721]: I0202 13:12:23.782656 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" event={"ID":"1733f0ca-4783-4099-b66d-b497993def10","Type":"ContainerStarted","Data":"b05b359c6c24dec978ec5f554107a55e11aabf0c5b2ec336c75619e2d32e34f8"} Feb 02 13:12:23 crc kubenswrapper[4721]: I0202 13:12:23.782691 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" event={"ID":"1733f0ca-4783-4099-b66d-b497993def10","Type":"ContainerStarted","Data":"bddb1128d3b459560504046faf54d940a06531590b0482d4fa2c39d39ac21d68"} Feb 02 13:12:24 crc kubenswrapper[4721]: I0202 13:12:24.790556 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" event={"ID":"1733f0ca-4783-4099-b66d-b497993def10","Type":"ContainerStarted","Data":"4e9b834c790e6421a7228961ab52a38a1f1339bcd72b9be601c4f9752feff355"} Feb 02 13:12:24 crc kubenswrapper[4721]: I0202 13:12:24.790800 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" event={"ID":"1733f0ca-4783-4099-b66d-b497993def10","Type":"ContainerStarted","Data":"9d383725cf59ebeabd80fd7da95fa1c4fd990c1dd13d8362360f03963bc75b72"} Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.265822 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt"] Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.266909 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.268526 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-fm7xc" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.269024 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.269179 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.400822 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx"] Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.401480 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.403292 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.403771 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pcw2\" (UniqueName: \"kubernetes.io/projected/30a5c0d6-c773-4914-a3b1-1654a51817a9-kube-api-access-6pcw2\") pod \"obo-prometheus-operator-68bc856cb9-jltbt\" (UID: \"30a5c0d6-c773-4914-a3b1-1654a51817a9\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.404008 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-5h9g6" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.427775 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9"] Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.428829 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.505350 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41c83c75-cfc8-4c33-97cf-484cc7dcd812-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9\" (UID: \"41c83c75-cfc8-4c33-97cf-484cc7dcd812\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.505401 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1f92c36-0e50-485e-a728-7b42f1ab44c4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx\" (UID: \"a1f92c36-0e50-485e-a728-7b42f1ab44c4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.505440 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1f92c36-0e50-485e-a728-7b42f1ab44c4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx\" (UID: \"a1f92c36-0e50-485e-a728-7b42f1ab44c4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.505672 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pcw2\" (UniqueName: \"kubernetes.io/projected/30a5c0d6-c773-4914-a3b1-1654a51817a9-kube-api-access-6pcw2\") pod \"obo-prometheus-operator-68bc856cb9-jltbt\" (UID: \"30a5c0d6-c773-4914-a3b1-1654a51817a9\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.505810 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41c83c75-cfc8-4c33-97cf-484cc7dcd812-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9\" (UID: \"41c83c75-cfc8-4c33-97cf-484cc7dcd812\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.528906 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pcw2\" (UniqueName: \"kubernetes.io/projected/30a5c0d6-c773-4914-a3b1-1654a51817a9-kube-api-access-6pcw2\") pod \"obo-prometheus-operator-68bc856cb9-jltbt\" (UID: \"30a5c0d6-c773-4914-a3b1-1654a51817a9\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.583681 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.607280 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1f92c36-0e50-485e-a728-7b42f1ab44c4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx\" (UID: \"a1f92c36-0e50-485e-a728-7b42f1ab44c4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.607349 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1f92c36-0e50-485e-a728-7b42f1ab44c4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx\" (UID: \"a1f92c36-0e50-485e-a728-7b42f1ab44c4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.607419 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41c83c75-cfc8-4c33-97cf-484cc7dcd812-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9\" (UID: \"41c83c75-cfc8-4c33-97cf-484cc7dcd812\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.607471 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41c83c75-cfc8-4c33-97cf-484cc7dcd812-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9\" (UID: \"41c83c75-cfc8-4c33-97cf-484cc7dcd812\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.608734 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-b2dk9"] Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.611571 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/41c83c75-cfc8-4c33-97cf-484cc7dcd812-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9\" (UID: \"41c83c75-cfc8-4c33-97cf-484cc7dcd812\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.612010 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a1f92c36-0e50-485e-a728-7b42f1ab44c4-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx\" (UID: \"a1f92c36-0e50-485e-a728-7b42f1ab44c4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.613909 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a1f92c36-0e50-485e-a728-7b42f1ab44c4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx\" (UID: \"a1f92c36-0e50-485e-a728-7b42f1ab44c4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.615518 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/41c83c75-cfc8-4c33-97cf-484cc7dcd812-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9\" (UID: \"41c83c75-cfc8-4c33-97cf-484cc7dcd812\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.616823 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.619939 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-lfslk" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.620169 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.646595 4721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators_30a5c0d6-c773-4914-a3b1-1654a51817a9_0(79672fbdaf8c685ac07766c52286d5c81ef8d69be2e667ad19714265c129754c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.646662 4721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators_30a5c0d6-c773-4914-a3b1-1654a51817a9_0(79672fbdaf8c685ac07766c52286d5c81ef8d69be2e667ad19714265c129754c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.646688 4721 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators_30a5c0d6-c773-4914-a3b1-1654a51817a9_0(79672fbdaf8c685ac07766c52286d5c81ef8d69be2e667ad19714265c129754c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.646727 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators(30a5c0d6-c773-4914-a3b1-1654a51817a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators(30a5c0d6-c773-4914-a3b1-1654a51817a9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators_30a5c0d6-c773-4914-a3b1-1654a51817a9_0(79672fbdaf8c685ac07766c52286d5c81ef8d69be2e667ad19714265c129754c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" podUID="30a5c0d6-c773-4914-a3b1-1654a51817a9" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.708312 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ac0e2d1-4762-4c40-84c9-db0bde4f956f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-b2dk9\" (UID: \"7ac0e2d1-4762-4c40-84c9-db0bde4f956f\") " pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.708656 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8mx5\" (UniqueName: \"kubernetes.io/projected/7ac0e2d1-4762-4c40-84c9-db0bde4f956f-kube-api-access-b8mx5\") pod \"observability-operator-59bdc8b94-b2dk9\" (UID: \"7ac0e2d1-4762-4c40-84c9-db0bde4f956f\") " pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.717227 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.739977 4721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators_a1f92c36-0e50-485e-a728-7b42f1ab44c4_0(7b53255c9c2462398e11314e80045559915817c75fa7fef85957c68092021385): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.740060 4721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators_a1f92c36-0e50-485e-a728-7b42f1ab44c4_0(7b53255c9c2462398e11314e80045559915817c75fa7fef85957c68092021385): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.740107 4721 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators_a1f92c36-0e50-485e-a728-7b42f1ab44c4_0(7b53255c9c2462398e11314e80045559915817c75fa7fef85957c68092021385): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.740162 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators(a1f92c36-0e50-485e-a728-7b42f1ab44c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators(a1f92c36-0e50-485e-a728-7b42f1ab44c4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators_a1f92c36-0e50-485e-a728-7b42f1ab44c4_0(7b53255c9c2462398e11314e80045559915817c75fa7fef85957c68092021385): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" podUID="a1f92c36-0e50-485e-a728-7b42f1ab44c4" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.743537 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.766018 4721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators_41c83c75-cfc8-4c33-97cf-484cc7dcd812_0(fcbfbf0412edb28b35426e81591ffd922262d9a3013d56745bc3e277e40a1672): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.766104 4721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators_41c83c75-cfc8-4c33-97cf-484cc7dcd812_0(fcbfbf0412edb28b35426e81591ffd922262d9a3013d56745bc3e277e40a1672): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.766131 4721 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators_41c83c75-cfc8-4c33-97cf-484cc7dcd812_0(fcbfbf0412edb28b35426e81591ffd922262d9a3013d56745bc3e277e40a1672): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.766182 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators(41c83c75-cfc8-4c33-97cf-484cc7dcd812)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators(41c83c75-cfc8-4c33-97cf-484cc7dcd812)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators_41c83c75-cfc8-4c33-97cf-484cc7dcd812_0(fcbfbf0412edb28b35426e81591ffd922262d9a3013d56745bc3e277e40a1672): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" podUID="41c83c75-cfc8-4c33-97cf-484cc7dcd812" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.805015 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" event={"ID":"1733f0ca-4783-4099-b66d-b497993def10","Type":"ContainerStarted","Data":"07603a2f12ad4b3eec83c311a37dd5f84245f8bd2803504e03e20579f16dcce5"} Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.808940 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-5w6sx"] Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.809706 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.809962 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ac0e2d1-4762-4c40-84c9-db0bde4f956f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-b2dk9\" (UID: \"7ac0e2d1-4762-4c40-84c9-db0bde4f956f\") " pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.810126 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8mx5\" (UniqueName: \"kubernetes.io/projected/7ac0e2d1-4762-4c40-84c9-db0bde4f956f-kube-api-access-b8mx5\") pod \"observability-operator-59bdc8b94-b2dk9\" (UID: \"7ac0e2d1-4762-4c40-84c9-db0bde4f956f\") " pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.814096 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ac0e2d1-4762-4c40-84c9-db0bde4f956f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-b2dk9\" (UID: \"7ac0e2d1-4762-4c40-84c9-db0bde4f956f\") " pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.814389 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-6h7r5" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.830743 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8mx5\" (UniqueName: \"kubernetes.io/projected/7ac0e2d1-4762-4c40-84c9-db0bde4f956f-kube-api-access-b8mx5\") pod \"observability-operator-59bdc8b94-b2dk9\" (UID: 
\"7ac0e2d1-4762-4c40-84c9-db0bde4f956f\") " pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.911350 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a3affad2-ab35-4604-8239-56f69bf3727f-openshift-service-ca\") pod \"perses-operator-5bf474d74f-5w6sx\" (UID: \"a3affad2-ab35-4604-8239-56f69bf3727f\") " pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.911396 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksr2c\" (UniqueName: \"kubernetes.io/projected/a3affad2-ab35-4604-8239-56f69bf3727f-kube-api-access-ksr2c\") pod \"perses-operator-5bf474d74f-5w6sx\" (UID: \"a3affad2-ab35-4604-8239-56f69bf3727f\") " pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:26 crc kubenswrapper[4721]: I0202 13:12:26.980134 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.998930 4721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-b2dk9_openshift-operators_7ac0e2d1-4762-4c40-84c9-db0bde4f956f_0(3e6cd8595f687626e31bfe27e13ca08eb7499fc327ff2c360e56822c8bee104b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.999026 4721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-b2dk9_openshift-operators_7ac0e2d1-4762-4c40-84c9-db0bde4f956f_0(3e6cd8595f687626e31bfe27e13ca08eb7499fc327ff2c360e56822c8bee104b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.999056 4721 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-b2dk9_openshift-operators_7ac0e2d1-4762-4c40-84c9-db0bde4f956f_0(3e6cd8595f687626e31bfe27e13ca08eb7499fc327ff2c360e56822c8bee104b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" Feb 02 13:12:26 crc kubenswrapper[4721]: E0202 13:12:26.999125 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-b2dk9_openshift-operators(7ac0e2d1-4762-4c40-84c9-db0bde4f956f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-b2dk9_openshift-operators(7ac0e2d1-4762-4c40-84c9-db0bde4f956f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-b2dk9_openshift-operators_7ac0e2d1-4762-4c40-84c9-db0bde4f956f_0(3e6cd8595f687626e31bfe27e13ca08eb7499fc327ff2c360e56822c8bee104b): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" podUID="7ac0e2d1-4762-4c40-84c9-db0bde4f956f" Feb 02 13:12:27 crc kubenswrapper[4721]: I0202 13:12:27.012983 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksr2c\" (UniqueName: \"kubernetes.io/projected/a3affad2-ab35-4604-8239-56f69bf3727f-kube-api-access-ksr2c\") pod \"perses-operator-5bf474d74f-5w6sx\" (UID: \"a3affad2-ab35-4604-8239-56f69bf3727f\") " pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:27 crc kubenswrapper[4721]: I0202 13:12:27.013138 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a3affad2-ab35-4604-8239-56f69bf3727f-openshift-service-ca\") pod \"perses-operator-5bf474d74f-5w6sx\" (UID: \"a3affad2-ab35-4604-8239-56f69bf3727f\") " pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:27 crc kubenswrapper[4721]: I0202 13:12:27.014131 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a3affad2-ab35-4604-8239-56f69bf3727f-openshift-service-ca\") pod \"perses-operator-5bf474d74f-5w6sx\" (UID: \"a3affad2-ab35-4604-8239-56f69bf3727f\") " pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:27 crc kubenswrapper[4721]: I0202 13:12:27.037423 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksr2c\" (UniqueName: \"kubernetes.io/projected/a3affad2-ab35-4604-8239-56f69bf3727f-kube-api-access-ksr2c\") pod \"perses-operator-5bf474d74f-5w6sx\" (UID: \"a3affad2-ab35-4604-8239-56f69bf3727f\") " pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:27 crc kubenswrapper[4721]: I0202 13:12:27.175751 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:27 crc kubenswrapper[4721]: E0202 13:12:27.206561 4721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-5w6sx_openshift-operators_a3affad2-ab35-4604-8239-56f69bf3727f_0(559b440f5cba3e7d109837dbc854906999eca986a89a4fb86a01200233f035ec): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 13:12:27 crc kubenswrapper[4721]: E0202 13:12:27.206644 4721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-5w6sx_openshift-operators_a3affad2-ab35-4604-8239-56f69bf3727f_0(559b440f5cba3e7d109837dbc854906999eca986a89a4fb86a01200233f035ec): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:27 crc kubenswrapper[4721]: E0202 13:12:27.206664 4721 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-5w6sx_openshift-operators_a3affad2-ab35-4604-8239-56f69bf3727f_0(559b440f5cba3e7d109837dbc854906999eca986a89a4fb86a01200233f035ec): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:27 crc kubenswrapper[4721]: E0202 13:12:27.206709 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-5w6sx_openshift-operators(a3affad2-ab35-4604-8239-56f69bf3727f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-5w6sx_openshift-operators(a3affad2-ab35-4604-8239-56f69bf3727f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-5w6sx_openshift-operators_a3affad2-ab35-4604-8239-56f69bf3727f_0(559b440f5cba3e7d109837dbc854906999eca986a89a4fb86a01200233f035ec): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" podUID="a3affad2-ab35-4604-8239-56f69bf3727f" Feb 02 13:12:28 crc kubenswrapper[4721]: I0202 13:12:28.823616 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" event={"ID":"1733f0ca-4783-4099-b66d-b497993def10","Type":"ContainerStarted","Data":"a0d41756427f2961f838ea426ab5e3b3ff7ceafb5de64c0df4fd9d48d8c17bc0"} Feb 02 13:12:28 crc kubenswrapper[4721]: I0202 13:12:28.824341 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" Feb 02 13:12:28 crc kubenswrapper[4721]: I0202 13:12:28.824378 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" Feb 02 13:12:28 crc kubenswrapper[4721]: I0202 13:12:28.824423 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" Feb 02 13:12:28 crc kubenswrapper[4721]: I0202 13:12:28.852293 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" Feb 02 13:12:28 crc kubenswrapper[4721]: I0202 13:12:28.856361 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" Feb 02 13:12:28 crc kubenswrapper[4721]: I0202 13:12:28.873760 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" podStartSLOduration=7.873744889 podStartE2EDuration="7.873744889s" podCreationTimestamp="2026-02-02 13:12:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:12:28.872339692 +0000 UTC m=+689.174854101" watchObservedRunningTime="2026-02-02 13:12:28.873744889 +0000 UTC m=+689.176259278" Feb 02 13:12:29 crc kubenswrapper[4721]: I0202 13:12:29.359302 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-b2dk9"] Feb 02 13:12:29 crc kubenswrapper[4721]: I0202 13:12:29.359414 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" Feb 02 13:12:29 crc kubenswrapper[4721]: I0202 13:12:29.359815 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9"
Feb 02 13:12:29 crc kubenswrapper[4721]: I0202 13:12:29.370048 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt"]
Feb 02 13:12:29 crc kubenswrapper[4721]: I0202 13:12:29.370361 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt"
Feb 02 13:12:29 crc kubenswrapper[4721]: I0202 13:12:29.370938 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt"
Feb 02 13:12:29 crc kubenswrapper[4721]: I0202 13:12:29.374489 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-5w6sx"]
Feb 02 13:12:29 crc kubenswrapper[4721]: I0202 13:12:29.374775 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx"
Feb 02 13:12:29 crc kubenswrapper[4721]: I0202 13:12:29.375394 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx"
Feb 02 13:12:29 crc kubenswrapper[4721]: I0202 13:12:29.378698 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx"]
Feb 02 13:12:29 crc kubenswrapper[4721]: I0202 13:12:29.378844 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx"
Feb 02 13:12:29 crc kubenswrapper[4721]: I0202 13:12:29.379437 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx"
Feb 02 13:12:29 crc kubenswrapper[4721]: I0202 13:12:29.394137 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9"]
Feb 02 13:12:29 crc kubenswrapper[4721]: I0202 13:12:29.394261 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9"
Feb 02 13:12:29 crc kubenswrapper[4721]: I0202 13:12:29.394809 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.433628 4721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-b2dk9_openshift-operators_7ac0e2d1-4762-4c40-84c9-db0bde4f956f_0(e13cadae6e67404dc5cc138b05d1a002778f8c87b9f279d8f66c027a11e6b233): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.433991 4721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-b2dk9_openshift-operators_7ac0e2d1-4762-4c40-84c9-db0bde4f956f_0(e13cadae6e67404dc5cc138b05d1a002778f8c87b9f279d8f66c027a11e6b233): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.434021 4721 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-b2dk9_openshift-operators_7ac0e2d1-4762-4c40-84c9-db0bde4f956f_0(e13cadae6e67404dc5cc138b05d1a002778f8c87b9f279d8f66c027a11e6b233): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.434094 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-b2dk9_openshift-operators(7ac0e2d1-4762-4c40-84c9-db0bde4f956f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-b2dk9_openshift-operators(7ac0e2d1-4762-4c40-84c9-db0bde4f956f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-b2dk9_openshift-operators_7ac0e2d1-4762-4c40-84c9-db0bde4f956f_0(e13cadae6e67404dc5cc138b05d1a002778f8c87b9f279d8f66c027a11e6b233): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" podUID="7ac0e2d1-4762-4c40-84c9-db0bde4f956f"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.445936 4721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators_30a5c0d6-c773-4914-a3b1-1654a51817a9_0(744565d13fa0b7ab4ffc378ca0de13e51c331edae3ce5a6bc266a3e354e1199d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.445998 4721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators_30a5c0d6-c773-4914-a3b1-1654a51817a9_0(744565d13fa0b7ab4ffc378ca0de13e51c331edae3ce5a6bc266a3e354e1199d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.446018 4721 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators_30a5c0d6-c773-4914-a3b1-1654a51817a9_0(744565d13fa0b7ab4ffc378ca0de13e51c331edae3ce5a6bc266a3e354e1199d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.446107 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators(30a5c0d6-c773-4914-a3b1-1654a51817a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators(30a5c0d6-c773-4914-a3b1-1654a51817a9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators_30a5c0d6-c773-4914-a3b1-1654a51817a9_0(744565d13fa0b7ab4ffc378ca0de13e51c331edae3ce5a6bc266a3e354e1199d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" podUID="30a5c0d6-c773-4914-a3b1-1654a51817a9"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.477463 4721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators_a1f92c36-0e50-485e-a728-7b42f1ab44c4_0(31249dcddf3c0fc57945822e001a973e64dee694abb2cf2ab341207103ac9f47): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.477537 4721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators_a1f92c36-0e50-485e-a728-7b42f1ab44c4_0(31249dcddf3c0fc57945822e001a973e64dee694abb2cf2ab341207103ac9f47): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.477557 4721 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators_a1f92c36-0e50-485e-a728-7b42f1ab44c4_0(31249dcddf3c0fc57945822e001a973e64dee694abb2cf2ab341207103ac9f47): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.477608 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators(a1f92c36-0e50-485e-a728-7b42f1ab44c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators(a1f92c36-0e50-485e-a728-7b42f1ab44c4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators_a1f92c36-0e50-485e-a728-7b42f1ab44c4_0(31249dcddf3c0fc57945822e001a973e64dee694abb2cf2ab341207103ac9f47): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" podUID="a1f92c36-0e50-485e-a728-7b42f1ab44c4"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.485850 4721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-5w6sx_openshift-operators_a3affad2-ab35-4604-8239-56f69bf3727f_0(8d132a561b69802e29a8ee38347c502a9fd189fae8c8a62b003c376c63023b43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.485928 4721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-5w6sx_openshift-operators_a3affad2-ab35-4604-8239-56f69bf3727f_0(8d132a561b69802e29a8ee38347c502a9fd189fae8c8a62b003c376c63023b43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.485952 4721 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-5w6sx_openshift-operators_a3affad2-ab35-4604-8239-56f69bf3727f_0(8d132a561b69802e29a8ee38347c502a9fd189fae8c8a62b003c376c63023b43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.486009 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-5w6sx_openshift-operators(a3affad2-ab35-4604-8239-56f69bf3727f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-5w6sx_openshift-operators(a3affad2-ab35-4604-8239-56f69bf3727f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-5w6sx_openshift-operators_a3affad2-ab35-4604-8239-56f69bf3727f_0(8d132a561b69802e29a8ee38347c502a9fd189fae8c8a62b003c376c63023b43): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" podUID="a3affad2-ab35-4604-8239-56f69bf3727f"
Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.498951 4721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators_41c83c75-cfc8-4c33-97cf-484cc7dcd812_0(925a509b9c0ad7623eb20fdd705a0eab537b0e01d6f3709632c58d5ca1f0b964): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
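Note: every "RunPodSandbox from runtime service failed" entry in the burst above has the same root cause: at this point there is no CNI network config on the node, so the runtime cannot wire up a pod network, and the kubelet keeps retrying sandbox creation. A minimal standalone sketch of the readiness condition behind the "no CNI configuration file" message follows; the *.conf/*.conflist/*.json patterns are an assumption borrowed from the usual libcni lookup, not something read out of this log.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cniConfDir is the directory named in the errors above.
const cniConfDir = "/etc/kubernetes/cni/net.d"

func main() {
	// A node's network is considered configured once at least one
	// CNI config file exists in the conf dir (patterns assumed).
	var found []string
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, _ := filepath.Glob(filepath.Join(cniConfDir, pat))
		found = append(found, matches...)
	}
	if len(found) == 0 {
		// The condition behind "no CNI configuration file in
		// /etc/kubernetes/cni/net.d/. Has your network provider started?"
		fmt.Fprintln(os.Stderr, "no CNI configuration file in", cniConfDir)
		os.Exit(1)
	}
	fmt.Println("CNI config present:", found)
}

Once the network provider writes a config into that directory, the very same sandbox creations start succeeding, which is what the entries from 13:12:52 onward show.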
pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.499030 4721 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators_41c83c75-cfc8-4c33-97cf-484cc7dcd812_0(925a509b9c0ad7623eb20fdd705a0eab537b0e01d6f3709632c58d5ca1f0b964): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:29 crc kubenswrapper[4721]: E0202 13:12:29.499089 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators(41c83c75-cfc8-4c33-97cf-484cc7dcd812)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators(41c83c75-cfc8-4c33-97cf-484cc7dcd812)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators_41c83c75-cfc8-4c33-97cf-484cc7dcd812_0(925a509b9c0ad7623eb20fdd705a0eab537b0e01d6f3709632c58d5ca1f0b964): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" podUID="41c83c75-cfc8-4c33-97cf-484cc7dcd812" Feb 02 13:12:35 crc kubenswrapper[4721]: I0202 13:12:35.409290 4721 scope.go:117] "RemoveContainer" containerID="c98892a7ff179bcba871f45746b3c85a83090c01da93d13aeaeb2a282472689d" Feb 02 13:12:35 crc kubenswrapper[4721]: E0202 13:12:35.409894 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-ltw7d_openshift-multus(5ba84858-caaa-4fba-8eaf-9f7ddece0b3a)\"" pod="openshift-multus/multus-ltw7d" podUID="5ba84858-caaa-4fba-8eaf-9f7ddece0b3a" Feb 02 13:12:41 crc kubenswrapper[4721]: I0202 13:12:41.409587 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" Feb 02 13:12:41 crc kubenswrapper[4721]: I0202 13:12:41.410861 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" Feb 02 13:12:41 crc kubenswrapper[4721]: E0202 13:12:41.437084 4721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators_30a5c0d6-c773-4914-a3b1-1654a51817a9_0(899f8c0ce7a06f9795d4a4d32db0b45716345b6ca2a00aa1464994e8e1344961): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 13:12:41 crc kubenswrapper[4721]: E0202 13:12:41.437151 4721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators_30a5c0d6-c773-4914-a3b1-1654a51817a9_0(899f8c0ce7a06f9795d4a4d32db0b45716345b6ca2a00aa1464994e8e1344961): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" Feb 02 13:12:41 crc kubenswrapper[4721]: E0202 13:12:41.437173 4721 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators_30a5c0d6-c773-4914-a3b1-1654a51817a9_0(899f8c0ce7a06f9795d4a4d32db0b45716345b6ca2a00aa1464994e8e1344961): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" Feb 02 13:12:41 crc kubenswrapper[4721]: E0202 13:12:41.437220 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators(30a5c0d6-c773-4914-a3b1-1654a51817a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators(30a5c0d6-c773-4914-a3b1-1654a51817a9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-jltbt_openshift-operators_30a5c0d6-c773-4914-a3b1-1654a51817a9_0(899f8c0ce7a06f9795d4a4d32db0b45716345b6ca2a00aa1464994e8e1344961): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" podUID="30a5c0d6-c773-4914-a3b1-1654a51817a9" Feb 02 13:12:42 crc kubenswrapper[4721]: I0202 13:12:42.408980 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:42 crc kubenswrapper[4721]: I0202 13:12:42.409259 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:42 crc kubenswrapper[4721]: I0202 13:12:42.411927 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:42 crc kubenswrapper[4721]: I0202 13:12:42.415162 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:42 crc kubenswrapper[4721]: E0202 13:12:42.452360 4721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators_41c83c75-cfc8-4c33-97cf-484cc7dcd812_0(45c55302ef56cadb0d8b014e10cc03381dacc7de88d6435b11300183368c7e46): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 13:12:42 crc kubenswrapper[4721]: E0202 13:12:42.452444 4721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators_41c83c75-cfc8-4c33-97cf-484cc7dcd812_0(45c55302ef56cadb0d8b014e10cc03381dacc7de88d6435b11300183368c7e46): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:42 crc kubenswrapper[4721]: E0202 13:12:42.452465 4721 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators_41c83c75-cfc8-4c33-97cf-484cc7dcd812_0(45c55302ef56cadb0d8b014e10cc03381dacc7de88d6435b11300183368c7e46): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" Feb 02 13:12:42 crc kubenswrapper[4721]: E0202 13:12:42.452544 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators(41c83c75-cfc8-4c33-97cf-484cc7dcd812)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators(41c83c75-cfc8-4c33-97cf-484cc7dcd812)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_openshift-operators_41c83c75-cfc8-4c33-97cf-484cc7dcd812_0(45c55302ef56cadb0d8b014e10cc03381dacc7de88d6435b11300183368c7e46): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" podUID="41c83c75-cfc8-4c33-97cf-484cc7dcd812" Feb 02 13:12:42 crc kubenswrapper[4721]: E0202 13:12:42.462334 4721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-5w6sx_openshift-operators_a3affad2-ab35-4604-8239-56f69bf3727f_0(8d3a4c19a1c7ed3a22d9a1c5e20de30186a33a2375386e695d2e2a32068b427b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 13:12:42 crc kubenswrapper[4721]: E0202 13:12:42.462407 4721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-5w6sx_openshift-operators_a3affad2-ab35-4604-8239-56f69bf3727f_0(8d3a4c19a1c7ed3a22d9a1c5e20de30186a33a2375386e695d2e2a32068b427b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:42 crc kubenswrapper[4721]: E0202 13:12:42.462435 4721 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-5w6sx_openshift-operators_a3affad2-ab35-4604-8239-56f69bf3727f_0(8d3a4c19a1c7ed3a22d9a1c5e20de30186a33a2375386e695d2e2a32068b427b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:42 crc kubenswrapper[4721]: E0202 13:12:42.462485 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-5w6sx_openshift-operators(a3affad2-ab35-4604-8239-56f69bf3727f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-5w6sx_openshift-operators(a3affad2-ab35-4604-8239-56f69bf3727f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-5w6sx_openshift-operators_a3affad2-ab35-4604-8239-56f69bf3727f_0(8d3a4c19a1c7ed3a22d9a1c5e20de30186a33a2375386e695d2e2a32068b427b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" podUID="a3affad2-ab35-4604-8239-56f69bf3727f" Feb 02 13:12:44 crc kubenswrapper[4721]: I0202 13:12:44.409309 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" Feb 02 13:12:44 crc kubenswrapper[4721]: I0202 13:12:44.409326 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" Feb 02 13:12:44 crc kubenswrapper[4721]: I0202 13:12:44.410184 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" Feb 02 13:12:44 crc kubenswrapper[4721]: I0202 13:12:44.410398 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" Feb 02 13:12:44 crc kubenswrapper[4721]: E0202 13:12:44.462834 4721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators_a1f92c36-0e50-485e-a728-7b42f1ab44c4_0(0611a943d53f331cf56689346926abfb020aa089e33ec02826f61b3548f26290): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 13:12:44 crc kubenswrapper[4721]: E0202 13:12:44.462911 4721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators_a1f92c36-0e50-485e-a728-7b42f1ab44c4_0(0611a943d53f331cf56689346926abfb020aa089e33ec02826f61b3548f26290): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" Feb 02 13:12:44 crc kubenswrapper[4721]: E0202 13:12:44.462932 4721 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators_a1f92c36-0e50-485e-a728-7b42f1ab44c4_0(0611a943d53f331cf56689346926abfb020aa089e33ec02826f61b3548f26290): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" Feb 02 13:12:44 crc kubenswrapper[4721]: E0202 13:12:44.462977 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators(a1f92c36-0e50-485e-a728-7b42f1ab44c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators(a1f92c36-0e50-485e-a728-7b42f1ab44c4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_openshift-operators_a1f92c36-0e50-485e-a728-7b42f1ab44c4_0(0611a943d53f331cf56689346926abfb020aa089e33ec02826f61b3548f26290): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" podUID="a1f92c36-0e50-485e-a728-7b42f1ab44c4" Feb 02 13:12:44 crc kubenswrapper[4721]: E0202 13:12:44.467311 4721 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-b2dk9_openshift-operators_7ac0e2d1-4762-4c40-84c9-db0bde4f956f_0(40000d12aa3de8bcf4ce996ce73297b583c46039f3b66517b02b6fc3487e9d31): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 13:12:44 crc kubenswrapper[4721]: E0202 13:12:44.467357 4721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-b2dk9_openshift-operators_7ac0e2d1-4762-4c40-84c9-db0bde4f956f_0(40000d12aa3de8bcf4ce996ce73297b583c46039f3b66517b02b6fc3487e9d31): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" Feb 02 13:12:44 crc kubenswrapper[4721]: E0202 13:12:44.467376 4721 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-b2dk9_openshift-operators_7ac0e2d1-4762-4c40-84c9-db0bde4f956f_0(40000d12aa3de8bcf4ce996ce73297b583c46039f3b66517b02b6fc3487e9d31): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" Feb 02 13:12:44 crc kubenswrapper[4721]: E0202 13:12:44.467412 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-b2dk9_openshift-operators(7ac0e2d1-4762-4c40-84c9-db0bde4f956f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-b2dk9_openshift-operators(7ac0e2d1-4762-4c40-84c9-db0bde4f956f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-b2dk9_openshift-operators_7ac0e2d1-4762-4c40-84c9-db0bde4f956f_0(40000d12aa3de8bcf4ce996ce73297b583c46039f3b66517b02b6fc3487e9d31): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" podUID="7ac0e2d1-4762-4c40-84c9-db0bde4f956f" Feb 02 13:12:48 crc kubenswrapper[4721]: I0202 13:12:48.409932 4721 scope.go:117] "RemoveContainer" containerID="c98892a7ff179bcba871f45746b3c85a83090c01da93d13aeaeb2a282472689d" Feb 02 13:12:48 crc kubenswrapper[4721]: I0202 13:12:48.928972 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltw7d_5ba84858-caaa-4fba-8eaf-9f7ddece0b3a/kube-multus/2.log" Feb 02 13:12:48 crc kubenswrapper[4721]: I0202 13:12:48.929330 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltw7d" event={"ID":"5ba84858-caaa-4fba-8eaf-9f7ddece0b3a","Type":"ContainerStarted","Data":"2ac6b70129b83205e808ed8ecdac9b06fb0b8752b06e7251666022066e956189"} Feb 02 13:12:51 crc kubenswrapper[4721]: I0202 13:12:51.742423 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8dtnt" Feb 02 13:12:52 crc kubenswrapper[4721]: I0202 13:12:52.409617 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" Feb 02 13:12:52 crc kubenswrapper[4721]: I0202 13:12:52.410046 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" Feb 02 13:12:52 crc kubenswrapper[4721]: I0202 13:12:52.787722 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt"] Feb 02 13:12:52 crc kubenswrapper[4721]: W0202 13:12:52.795256 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30a5c0d6_c773_4914_a3b1_1654a51817a9.slice/crio-2f9c4789155105e7d051b78b863e88c79803180d46590922638146e7c812eb92 WatchSource:0}: Error finding container 2f9c4789155105e7d051b78b863e88c79803180d46590922638146e7c812eb92: Status 404 returned error can't find the container with id 2f9c4789155105e7d051b78b863e88c79803180d46590922638146e7c812eb92 Feb 02 13:12:52 crc kubenswrapper[4721]: I0202 13:12:52.958385 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" event={"ID":"30a5c0d6-c773-4914-a3b1-1654a51817a9","Type":"ContainerStarted","Data":"2f9c4789155105e7d051b78b863e88c79803180d46590922638146e7c812eb92"} Feb 02 13:12:54 crc kubenswrapper[4721]: I0202 13:12:54.409174 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:54 crc kubenswrapper[4721]: I0202 13:12:54.410256 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" Feb 02 13:12:54 crc kubenswrapper[4721]: I0202 13:12:54.601376 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-5w6sx"] Feb 02 13:12:54 crc kubenswrapper[4721]: I0202 13:12:54.968852 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" event={"ID":"a3affad2-ab35-4604-8239-56f69bf3727f","Type":"ContainerStarted","Data":"41cd1333279decf96c22587391e7658b7932c4b5392a3c08ce0322d97d840237"} Feb 02 13:12:56 crc kubenswrapper[4721]: I0202 13:12:56.409755 4721 util.go:30] "No sandbox for pod can be found. 
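Note: the 13:12:35 "CrashLoopBackOff: back-off 20s restarting failed container=kube-multus" entry above is the kubelet's container restart backoff, and the 13:12:48 RemoveContainer/ContainerStarted pair is the retry that finally sticks, after which ovnkube-node goes ready and sandbox creation recovers. A small sketch of the commonly documented backoff shape (10s initial delay, doubling per failed restart, capped at 5 minutes; assumed for illustration, not read out of this log):

package main

import (
	"fmt"
	"time"
)

// crashLoopDelay returns the assumed kubelet restart delay after a
// given number of previous failed restarts: 10s, doubling, 5m cap.
func crashLoopDelay(restarts int) time.Duration {
	d := 10 * time.Second
	for i := 0; i < restarts; i++ {
		d *= 2
		if d > 5*time.Minute {
			return 5 * time.Minute
		}
	}
	return d
}

func main() {
	for r := 0; r <= 5; r++ {
		fmt.Printf("restart #%d -> back-off %s\n", r+1, crashLoopDelay(r))
	}
	// Under these assumptions "back-off 20s" corresponds to the
	// second restart attempt (10s -> 20s), matching the entry above.
}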
Feb 02 13:12:56 crc kubenswrapper[4721]: I0202 13:12:56.409755 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9"
Feb 02 13:12:56 crc kubenswrapper[4721]: I0202 13:12:56.410610 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9"
Feb 02 13:12:57 crc kubenswrapper[4721]: I0202 13:12:57.409276 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9"
Feb 02 13:12:57 crc kubenswrapper[4721]: I0202 13:12:57.409781 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9"
Feb 02 13:12:58 crc kubenswrapper[4721]: I0202 13:12:58.983160 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-b2dk9"]
Feb 02 13:12:58 crc kubenswrapper[4721]: W0202 13:12:58.986935 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ac0e2d1_4762_4c40_84c9_db0bde4f956f.slice/crio-4dba87266c45f42bb89bafe5ce1d1c90370220ef5d61cb2348e210117f8e1d61 WatchSource:0}: Error finding container 4dba87266c45f42bb89bafe5ce1d1c90370220ef5d61cb2348e210117f8e1d61: Status 404 returned error can't find the container with id 4dba87266c45f42bb89bafe5ce1d1c90370220ef5d61cb2348e210117f8e1d61
Feb 02 13:12:58 crc kubenswrapper[4721]: I0202 13:12:58.998369 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" event={"ID":"7ac0e2d1-4762-4c40-84c9-db0bde4f956f","Type":"ContainerStarted","Data":"4dba87266c45f42bb89bafe5ce1d1c90370220ef5d61cb2348e210117f8e1d61"}
Feb 02 13:12:59 crc kubenswrapper[4721]: I0202 13:12:59.000720 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" event={"ID":"a3affad2-ab35-4604-8239-56f69bf3727f","Type":"ContainerStarted","Data":"11b8000d94074919c4278e3638fecaea14a5fdd31740eb1c45cae3b16a9bb50f"}
Feb 02 13:12:59 crc kubenswrapper[4721]: I0202 13:12:59.000819 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx"
Feb 02 13:12:59 crc kubenswrapper[4721]: I0202 13:12:59.025741 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx" podStartSLOduration=28.845120914 podStartE2EDuration="33.025711675s" podCreationTimestamp="2026-02-02 13:12:26 +0000 UTC" firstStartedPulling="2026-02-02 13:12:54.621529743 +0000 UTC m=+714.924044132" lastFinishedPulling="2026-02-02 13:12:58.802120504 +0000 UTC m=+719.104634893" observedRunningTime="2026-02-02 13:12:59.018935686 +0000 UTC m=+719.321450095" watchObservedRunningTime="2026-02-02 13:12:59.025711675 +0000 UTC m=+719.328226064"
Feb 02 13:12:59 crc kubenswrapper[4721]: I0202 13:12:59.044221 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9"]
Feb 02 13:12:59 crc kubenswrapper[4721]: W0202 13:12:59.050385 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41c83c75_cfc8_4c33_97cf_484cc7dcd812.slice/crio-1b2dc7067a50e7f3818fcaca5753f83828b37c0c088154ab826e4b03158e2ed0 WatchSource:0}: Error finding container 1b2dc7067a50e7f3818fcaca5753f83828b37c0c088154ab826e4b03158e2ed0: Status 404 returned error can't find the container with id 1b2dc7067a50e7f3818fcaca5753f83828b37c0c088154ab826e4b03158e2ed0
Feb 02 13:12:59 crc kubenswrapper[4721]: I0202 13:12:59.409145 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx"
Feb 02 13:12:59 crc kubenswrapper[4721]: I0202 13:12:59.409678 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx"
Feb 02 13:12:59 crc kubenswrapper[4721]: I0202 13:12:59.812038 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx"]
Feb 02 13:12:59 crc kubenswrapper[4721]: W0202 13:12:59.821340 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1f92c36_0e50_485e_a728_7b42f1ab44c4.slice/crio-5ec6a4c877d359b7d81f48561e68abb494f6d6c9455548b3b8e86b701d39f701 WatchSource:0}: Error finding container 5ec6a4c877d359b7d81f48561e68abb494f6d6c9455548b3b8e86b701d39f701: Status 404 returned error can't find the container with id 5ec6a4c877d359b7d81f48561e68abb494f6d6c9455548b3b8e86b701d39f701
Feb 02 13:13:00 crc kubenswrapper[4721]: I0202 13:13:00.009024 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" event={"ID":"a1f92c36-0e50-485e-a728-7b42f1ab44c4","Type":"ContainerStarted","Data":"5ec6a4c877d359b7d81f48561e68abb494f6d6c9455548b3b8e86b701d39f701"}
Feb 02 13:13:00 crc kubenswrapper[4721]: I0202 13:13:00.011958 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" event={"ID":"30a5c0d6-c773-4914-a3b1-1654a51817a9","Type":"ContainerStarted","Data":"b1d9f5aff26b9d078c5af4fbf2ee4d1cccb2f70e4ec53d06468a84b11e2b7df9"}
Feb 02 13:13:00 crc kubenswrapper[4721]: I0202 13:13:00.015381 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" event={"ID":"41c83c75-cfc8-4c33-97cf-484cc7dcd812","Type":"ContainerStarted","Data":"1b2dc7067a50e7f3818fcaca5753f83828b37c0c088154ab826e4b03158e2ed0"}
Feb 02 13:13:00 crc kubenswrapper[4721]: I0202 13:13:00.036740 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-jltbt" podStartSLOduration=28.014187723 podStartE2EDuration="34.036718476s" podCreationTimestamp="2026-02-02 13:12:26 +0000 UTC" firstStartedPulling="2026-02-02 13:12:52.797034962 +0000 UTC m=+713.099549351" lastFinishedPulling="2026-02-02 13:12:58.819565715 +0000 UTC m=+719.122080104" observedRunningTime="2026-02-02 13:13:00.032831704 +0000 UTC m=+720.335346093" watchObservedRunningTime="2026-02-02 13:13:00.036718476 +0000 UTC m=+720.339232875"
Feb 02 13:13:03 crc kubenswrapper[4721]: I0202 13:13:03.038461 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" event={"ID":"a1f92c36-0e50-485e-a728-7b42f1ab44c4","Type":"ContainerStarted","Data":"929a4b173449b94b30dab348faef189ffae53eab9fd6d0e48174ad4c4f8417a5"}
Feb 02 13:13:03 crc kubenswrapper[4721]: I0202 13:13:03.041384 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" event={"ID":"41c83c75-cfc8-4c33-97cf-484cc7dcd812","Type":"ContainerStarted","Data":"aad0efe57936a1531bdf3042e0f38c209b48ecbdfc3a30e837d7b60f96ab6d9b"}
Feb 02 13:13:03 crc kubenswrapper[4721]: I0202 13:13:03.061124 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx" podStartSLOduration=34.963326261 podStartE2EDuration="37.061096713s" podCreationTimestamp="2026-02-02 13:12:26 +0000 UTC" firstStartedPulling="2026-02-02 13:12:59.823539201 +0000 UTC m=+720.126053600" lastFinishedPulling="2026-02-02 13:13:01.921309663 +0000 UTC m=+722.223824052" observedRunningTime="2026-02-02 13:13:03.053574835 +0000 UTC m=+723.356089234" watchObservedRunningTime="2026-02-02 13:13:03.061096713 +0000 UTC m=+723.363611112"
Feb 02 13:13:03 crc kubenswrapper[4721]: I0202 13:13:03.100921 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9" podStartSLOduration=34.228947799 podStartE2EDuration="37.100896893s" podCreationTimestamp="2026-02-02 13:12:26 +0000 UTC" firstStartedPulling="2026-02-02 13:12:59.053347914 +0000 UTC m=+719.355862303" lastFinishedPulling="2026-02-02 13:13:01.925297008 +0000 UTC m=+722.227811397" observedRunningTime="2026-02-02 13:13:03.090935911 +0000 UTC m=+723.393450300" watchObservedRunningTime="2026-02-02 13:13:03.100896893 +0000 UTC m=+723.403411282"
Feb 02 13:13:06 crc kubenswrapper[4721]: I0202 13:13:06.060298 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" event={"ID":"7ac0e2d1-4762-4c40-84c9-db0bde4f956f","Type":"ContainerStarted","Data":"584beccc490533b60eb3b45ff37e098ffc0a4a9261ddf27e505ca1c196da960a"}
Feb 02 13:13:06 crc kubenswrapper[4721]: I0202 13:13:06.060936 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9"
Feb 02 13:13:06 crc kubenswrapper[4721]: I0202 13:13:06.067279 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9"
Feb 02 13:13:06 crc kubenswrapper[4721]: I0202 13:13:06.086788 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-b2dk9" podStartSLOduration=34.050964233 podStartE2EDuration="40.086766055s" podCreationTimestamp="2026-02-02 13:12:26 +0000 UTC" firstStartedPulling="2026-02-02 13:12:58.989897779 +0000 UTC m=+719.292412168" lastFinishedPulling="2026-02-02 13:13:05.025699601 +0000 UTC m=+725.328213990" observedRunningTime="2026-02-02 13:13:06.085280776 +0000 UTC m=+726.387795165" watchObservedRunningTime="2026-02-02 13:13:06.086766055 +0000 UTC m=+726.389280444"
Feb 02 13:13:07 crc kubenswrapper[4721]: I0202 13:13:07.179725 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-5w6sx"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.522024 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-k5vmr"]
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.524605 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-k5vmr"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.530637 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.530840 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.530973 4721 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-6xznh"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.535212 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-k5vmr"]
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.543550 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-qxmnx"]
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.544649 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-qxmnx"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.547440 4721 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-tcrxr"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.559900 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-qxmnx"]
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.585307 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-xjhrq"]
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.586264 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-xjhrq"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.588264 4721 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-ng4mz"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.596121 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv7cf\" (UniqueName: \"kubernetes.io/projected/caf3dcb7-c58d-4d36-9329-c9b8d3c354a8-kube-api-access-tv7cf\") pod \"cert-manager-858654f9db-qxmnx\" (UID: \"caf3dcb7-c58d-4d36-9329-c9b8d3c354a8\") " pod="cert-manager/cert-manager-858654f9db-qxmnx"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.596297 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-xjhrq"]
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.596484 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km82n\" (UniqueName: \"kubernetes.io/projected/988d3eab-804d-4db0-8855-b63ebbeabce4-kube-api-access-km82n\") pod \"cert-manager-cainjector-cf98fcc89-k5vmr\" (UID: \"988d3eab-804d-4db0-8855-b63ebbeabce4\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-k5vmr"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.697835 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv7cf\" (UniqueName: \"kubernetes.io/projected/caf3dcb7-c58d-4d36-9329-c9b8d3c354a8-kube-api-access-tv7cf\") pod \"cert-manager-858654f9db-qxmnx\" (UID: \"caf3dcb7-c58d-4d36-9329-c9b8d3c354a8\") " pod="cert-manager/cert-manager-858654f9db-qxmnx"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.697913 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjjlf\" (UniqueName: \"kubernetes.io/projected/aff1475e-36c5-471a-b04e-01cefc2d2763-kube-api-access-rjjlf\") pod \"cert-manager-webhook-687f57d79b-xjhrq\" (UID: \"aff1475e-36c5-471a-b04e-01cefc2d2763\") " pod="cert-manager/cert-manager-webhook-687f57d79b-xjhrq"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.697969 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km82n\" (UniqueName: \"kubernetes.io/projected/988d3eab-804d-4db0-8855-b63ebbeabce4-kube-api-access-km82n\") pod \"cert-manager-cainjector-cf98fcc89-k5vmr\" (UID: \"988d3eab-804d-4db0-8855-b63ebbeabce4\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-k5vmr"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.716140 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km82n\" (UniqueName: \"kubernetes.io/projected/988d3eab-804d-4db0-8855-b63ebbeabce4-kube-api-access-km82n\") pod \"cert-manager-cainjector-cf98fcc89-k5vmr\" (UID: \"988d3eab-804d-4db0-8855-b63ebbeabce4\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-k5vmr"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.716849 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv7cf\" (UniqueName: \"kubernetes.io/projected/caf3dcb7-c58d-4d36-9329-c9b8d3c354a8-kube-api-access-tv7cf\") pod \"cert-manager-858654f9db-qxmnx\" (UID: \"caf3dcb7-c58d-4d36-9329-c9b8d3c354a8\") " pod="cert-manager/cert-manager-858654f9db-qxmnx"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.799045 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjjlf\" (UniqueName: \"kubernetes.io/projected/aff1475e-36c5-471a-b04e-01cefc2d2763-kube-api-access-rjjlf\") pod \"cert-manager-webhook-687f57d79b-xjhrq\" (UID: \"aff1475e-36c5-471a-b04e-01cefc2d2763\") " pod="cert-manager/cert-manager-webhook-687f57d79b-xjhrq"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.815543 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjjlf\" (UniqueName: \"kubernetes.io/projected/aff1475e-36c5-471a-b04e-01cefc2d2763-kube-api-access-rjjlf\") pod \"cert-manager-webhook-687f57d79b-xjhrq\" (UID: \"aff1475e-36c5-471a-b04e-01cefc2d2763\") " pod="cert-manager/cert-manager-webhook-687f57d79b-xjhrq"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.843144 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-k5vmr"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.861368 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-qxmnx"
Feb 02 13:13:13 crc kubenswrapper[4721]: I0202 13:13:13.901596 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-xjhrq"
Feb 02 13:13:14 crc kubenswrapper[4721]: I0202 13:13:14.325909 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-k5vmr"]
Feb 02 13:13:14 crc kubenswrapper[4721]: I0202 13:13:14.390110 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-xjhrq"]
Feb 02 13:13:14 crc kubenswrapper[4721]: W0202 13:13:14.390932 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaff1475e_36c5_471a_b04e_01cefc2d2763.slice/crio-949ec2ec85c9aeb5fc90535cc0e3929a004fa0cdd739d85863eb24d30e78dee0 WatchSource:0}: Error finding container 949ec2ec85c9aeb5fc90535cc0e3929a004fa0cdd739d85863eb24d30e78dee0: Status 404 returned error can't find the container with id 949ec2ec85c9aeb5fc90535cc0e3929a004fa0cdd739d85863eb24d30e78dee0
Feb 02 13:13:14 crc kubenswrapper[4721]: W0202 13:13:14.422293 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaf3dcb7_c58d_4d36_9329_c9b8d3c354a8.slice/crio-a786d4cdab57400c86d985b797499e7d1b54c4388c492e8f1ee09ec1091cf5d2 WatchSource:0}: Error finding container a786d4cdab57400c86d985b797499e7d1b54c4388c492e8f1ee09ec1091cf5d2: Status 404 returned error can't find the container with id a786d4cdab57400c86d985b797499e7d1b54c4388c492e8f1ee09ec1091cf5d2
Feb 02 13:13:14 crc kubenswrapper[4721]: I0202 13:13:14.446920 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-qxmnx"]
Feb 02 13:13:15 crc kubenswrapper[4721]: I0202 13:13:15.113560 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-xjhrq" event={"ID":"aff1475e-36c5-471a-b04e-01cefc2d2763","Type":"ContainerStarted","Data":"949ec2ec85c9aeb5fc90535cc0e3929a004fa0cdd739d85863eb24d30e78dee0"}
Feb 02 13:13:15 crc kubenswrapper[4721]: I0202 13:13:15.115220 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-qxmnx" event={"ID":"caf3dcb7-c58d-4d36-9329-c9b8d3c354a8","Type":"ContainerStarted","Data":"a786d4cdab57400c86d985b797499e7d1b54c4388c492e8f1ee09ec1091cf5d2"}
Feb 02 13:13:15 crc kubenswrapper[4721]: I0202 13:13:15.116332 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-k5vmr" event={"ID":"988d3eab-804d-4db0-8855-b63ebbeabce4","Type":"ContainerStarted","Data":"21844476f36dfaca63d9ed98ea9c80e9eb9f93d3b4829b98fa17459c28d6bef7"}
Feb 02 13:13:19 crc kubenswrapper[4721]: I0202 13:13:19.140737 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-xjhrq" event={"ID":"aff1475e-36c5-471a-b04e-01cefc2d2763","Type":"ContainerStarted","Data":"4fc152b4b8f486654322c2ba4bfd1ebfc4890305d331d879a859186db0a95657"}
Feb 02 13:13:19 crc kubenswrapper[4721]: I0202 13:13:19.141782 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-xjhrq"
Feb 02 13:13:19 crc kubenswrapper[4721]: I0202 13:13:19.142945 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-qxmnx" event={"ID":"caf3dcb7-c58d-4d36-9329-c9b8d3c354a8","Type":"ContainerStarted","Data":"af4951d27df17a0c60d26dc17558c3dee39a008d040fd3a78623db0ab51f9626"}
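Note: the manager.go:1169 "Failed to process watch event ... Status 404" warnings above look like a benign startup race: the cgroup watcher sees a new crio-<id> cgroup before the runtime has registered the container, so the immediate by-ID lookup fails, and the same ID then shows up in a successful ContainerStarted event moments later (for example 949ec2ec... at 13:13:14.390932 versus 13:13:15.113560). A generic sketch of the usual remedy for such create/watch races, treating not-found as transient and retrying briefly; illustrative only, not the kubelet's actual handling:

package main

import (
	"errors"
	"fmt"
	"time"
)

var errNotFound = errors.New("can't find the container with id")

// lookupWithRetry retries a lookup while it keeps returning the
// transient not-found error, then gives up. (Illustrative sketch.)
func lookupWithRetry(lookup func() error, attempts int, wait time.Duration) error {
	var err error
	for i := 0; i < attempts; i++ {
		if err = lookup(); !errors.Is(err, errNotFound) {
			return err
		}
		time.Sleep(wait)
	}
	return err
}

func main() {
	calls := 0
	err := lookupWithRetry(func() error {
		calls++
		if calls < 3 {
			return errNotFound // cgroup seen before container registration
		}
		return nil
	}, 5, 10*time.Millisecond)
	fmt.Println(err) // <nil>: resolved once the container is registered
}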
Feb 02 13:13:19 crc kubenswrapper[4721]: I0202 13:13:19.145538 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-k5vmr" event={"ID":"988d3eab-804d-4db0-8855-b63ebbeabce4","Type":"ContainerStarted","Data":"3bdbbf9b1e755b63af56cb73e4c71330c2abb937996e7c8c2d39a0720db91488"}
Feb 02 13:13:19 crc kubenswrapper[4721]: I0202 13:13:19.161953 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-xjhrq" podStartSLOduration=1.705942228 podStartE2EDuration="6.161927736s" podCreationTimestamp="2026-02-02 13:13:13 +0000 UTC" firstStartedPulling="2026-02-02 13:13:14.39759024 +0000 UTC m=+734.700104629" lastFinishedPulling="2026-02-02 13:13:18.853575748 +0000 UTC m=+739.156090137" observedRunningTime="2026-02-02 13:13:19.155967049 +0000 UTC m=+739.458481468" watchObservedRunningTime="2026-02-02 13:13:19.161927736 +0000 UTC m=+739.464442145"
Feb 02 13:13:19 crc kubenswrapper[4721]: I0202 13:13:19.173444 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-qxmnx" podStartSLOduration=1.690537812 podStartE2EDuration="6.17342287s" podCreationTimestamp="2026-02-02 13:13:13 +0000 UTC" firstStartedPulling="2026-02-02 13:13:14.424403007 +0000 UTC m=+734.726917386" lastFinishedPulling="2026-02-02 13:13:18.907288055 +0000 UTC m=+739.209802444" observedRunningTime="2026-02-02 13:13:19.170393099 +0000 UTC m=+739.472907488" watchObservedRunningTime="2026-02-02 13:13:19.17342287 +0000 UTC m=+739.475937279"
Feb 02 13:13:28 crc kubenswrapper[4721]: I0202 13:13:28.905194 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-xjhrq"
Feb 02 13:13:28 crc kubenswrapper[4721]: I0202 13:13:28.923610 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-k5vmr" podStartSLOduration=11.34898778 podStartE2EDuration="15.923588959s" podCreationTimestamp="2026-02-02 13:13:13 +0000 UTC" firstStartedPulling="2026-02-02 13:13:14.339055625 +0000 UTC m=+734.641570014" lastFinishedPulling="2026-02-02 13:13:18.913656804 +0000 UTC m=+739.216171193" observedRunningTime="2026-02-02 13:13:19.199619771 +0000 UTC m=+739.502134160" watchObservedRunningTime="2026-02-02 13:13:28.923588959 +0000 UTC m=+749.226103358"
Feb 02 13:13:45 crc kubenswrapper[4721]: I0202 13:13:45.446004 4721 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 02 13:13:53 crc kubenswrapper[4721]: I0202 13:13:53.335057 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ngvnj"]
Feb 02 13:13:53 crc kubenswrapper[4721]: I0202 13:13:53.343443 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ngvnj"
Feb 02 13:13:53 crc kubenswrapper[4721]: I0202 13:13:53.379480 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ngvnj"]
Feb 02 13:13:53 crc kubenswrapper[4721]: I0202 13:13:53.414678 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-catalog-content\") pod \"community-operators-ngvnj\" (UID: \"e4bc9245-cc92-4fa8-a195-74b2c1fa3018\") " pod="openshift-marketplace/community-operators-ngvnj"
Feb 02 13:13:53 crc kubenswrapper[4721]: I0202 13:13:53.415046 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsw8s\" (UniqueName: \"kubernetes.io/projected/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-kube-api-access-fsw8s\") pod \"community-operators-ngvnj\" (UID: \"e4bc9245-cc92-4fa8-a195-74b2c1fa3018\") " pod="openshift-marketplace/community-operators-ngvnj"
Feb 02 13:13:53 crc kubenswrapper[4721]: I0202 13:13:53.415197 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-utilities\") pod \"community-operators-ngvnj\" (UID: \"e4bc9245-cc92-4fa8-a195-74b2c1fa3018\") " pod="openshift-marketplace/community-operators-ngvnj"
Feb 02 13:13:53 crc kubenswrapper[4721]: I0202 13:13:53.516350 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsw8s\" (UniqueName: \"kubernetes.io/projected/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-kube-api-access-fsw8s\") pod \"community-operators-ngvnj\" (UID: \"e4bc9245-cc92-4fa8-a195-74b2c1fa3018\") " pod="openshift-marketplace/community-operators-ngvnj"
Feb 02 13:13:53 crc kubenswrapper[4721]: I0202 13:13:53.517555 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-utilities\") pod \"community-operators-ngvnj\" (UID: \"e4bc9245-cc92-4fa8-a195-74b2c1fa3018\") " pod="openshift-marketplace/community-operators-ngvnj"
Feb 02 13:13:53 crc kubenswrapper[4721]: I0202 13:13:53.522236 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-catalog-content\") pod \"community-operators-ngvnj\" (UID: \"e4bc9245-cc92-4fa8-a195-74b2c1fa3018\") " pod="openshift-marketplace/community-operators-ngvnj"
Feb 02 13:13:53 crc kubenswrapper[4721]: I0202 13:13:53.522870 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-catalog-content\") pod \"community-operators-ngvnj\" (UID: \"e4bc9245-cc92-4fa8-a195-74b2c1fa3018\") " pod="openshift-marketplace/community-operators-ngvnj"
Feb 02 13:13:53 crc kubenswrapper[4721]: I0202 13:13:53.520863 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-utilities\") pod \"community-operators-ngvnj\" (UID: \"e4bc9245-cc92-4fa8-a195-74b2c1fa3018\") " pod="openshift-marketplace/community-operators-ngvnj"
Feb 02 13:13:53 crc kubenswrapper[4721]: I0202 13:13:53.544409 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsw8s\" (UniqueName: \"kubernetes.io/projected/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-kube-api-access-fsw8s\") pod \"community-operators-ngvnj\" (UID: \"e4bc9245-cc92-4fa8-a195-74b2c1fa3018\") " pod="openshift-marketplace/community-operators-ngvnj"
Feb 02 13:13:53 crc kubenswrapper[4721]: I0202 13:13:53.693998 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ngvnj"
Feb 02 13:13:53 crc kubenswrapper[4721]: I0202 13:13:53.938590 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ngvnj"]
Feb 02 13:13:54 crc kubenswrapper[4721]: I0202 13:13:54.370116 4721 generic.go:334] "Generic (PLEG): container finished" podID="e4bc9245-cc92-4fa8-a195-74b2c1fa3018" containerID="992bd2a820ea0937516ce359186e2efc41e06c8257ba292dc7b7f5246e9d846d" exitCode=0
Feb 02 13:13:54 crc kubenswrapper[4721]: I0202 13:13:54.370162 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngvnj" event={"ID":"e4bc9245-cc92-4fa8-a195-74b2c1fa3018","Type":"ContainerDied","Data":"992bd2a820ea0937516ce359186e2efc41e06c8257ba292dc7b7f5246e9d846d"}
Feb 02 13:13:54 crc kubenswrapper[4721]: I0202 13:13:54.370193 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngvnj" event={"ID":"e4bc9245-cc92-4fa8-a195-74b2c1fa3018","Type":"ContainerStarted","Data":"8c61895cb6dd6e4aec63d8237d91407d9d43fbbfa704efe9db664fadf6487376"}
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.348150 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n"]
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.349353 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n"
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.351305 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.378331 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngvnj" event={"ID":"e4bc9245-cc92-4fa8-a195-74b2c1fa3018","Type":"ContainerStarted","Data":"56eb0bf573c34b6796229e31bbdcd14a34f6bd281bbf7f087dc4989701986dbe"}
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.395574 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n"]
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.549630 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b23cf05-2074-4e51-b6ef-235b207d8b16-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n\" (UID: \"4b23cf05-2074-4e51-b6ef-235b207d8b16\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n"
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.549733 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfcfv\" (UniqueName: \"kubernetes.io/projected/4b23cf05-2074-4e51-b6ef-235b207d8b16-kube-api-access-dfcfv\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n\" (UID: \"4b23cf05-2074-4e51-b6ef-235b207d8b16\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n"
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.549802 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b23cf05-2074-4e51-b6ef-235b207d8b16-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n\" (UID: \"4b23cf05-2074-4e51-b6ef-235b207d8b16\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n"
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.651249 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b23cf05-2074-4e51-b6ef-235b207d8b16-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n\" (UID: \"4b23cf05-2074-4e51-b6ef-235b207d8b16\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n"
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.651315 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfcfv\" (UniqueName: \"kubernetes.io/projected/4b23cf05-2074-4e51-b6ef-235b207d8b16-kube-api-access-dfcfv\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n\" (UID: \"4b23cf05-2074-4e51-b6ef-235b207d8b16\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n"
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.651364 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b23cf05-2074-4e51-b6ef-235b207d8b16-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n\" (UID: \"4b23cf05-2074-4e51-b6ef-235b207d8b16\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n"
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.651771 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b23cf05-2074-4e51-b6ef-235b207d8b16-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n\" (UID: \"4b23cf05-2074-4e51-b6ef-235b207d8b16\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n"
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.651789 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b23cf05-2074-4e51-b6ef-235b207d8b16-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n\" (UID: \"4b23cf05-2074-4e51-b6ef-235b207d8b16\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n"
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.671004 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfcfv\" (UniqueName: \"kubernetes.io/projected/4b23cf05-2074-4e51-b6ef-235b207d8b16-kube-api-access-dfcfv\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n\" (UID: \"4b23cf05-2074-4e51-b6ef-235b207d8b16\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n"
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.766984 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7"]
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.768891 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7"
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.783630 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7"]
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.955887 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7\" (UID: \"53fd5f54-e7fe-4d86-a5b7-3583e945fff3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7"
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.955948 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzlks\" (UniqueName: \"kubernetes.io/projected/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-kube-api-access-hzlks\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7\" (UID: \"53fd5f54-e7fe-4d86-a5b7-3583e945fff3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7"
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.955971 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7\" (UID: \"53fd5f54-e7fe-4d86-a5b7-3583e945fff3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7"
Feb 02 13:13:55 crc kubenswrapper[4721]: I0202 13:13:55.963893 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n" Feb 02 13:13:56 crc kubenswrapper[4721]: I0202 13:13:56.057003 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7\" (UID: \"53fd5f54-e7fe-4d86-a5b7-3583e945fff3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7" Feb 02 13:13:56 crc kubenswrapper[4721]: I0202 13:13:56.057177 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7\" (UID: \"53fd5f54-e7fe-4d86-a5b7-3583e945fff3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7" Feb 02 13:13:56 crc kubenswrapper[4721]: I0202 13:13:56.057229 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzlks\" (UniqueName: \"kubernetes.io/projected/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-kube-api-access-hzlks\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7\" (UID: \"53fd5f54-e7fe-4d86-a5b7-3583e945fff3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7" Feb 02 13:13:56 crc kubenswrapper[4721]: I0202 13:13:56.058829 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7\" (UID: \"53fd5f54-e7fe-4d86-a5b7-3583e945fff3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7" Feb 02 13:13:56 crc kubenswrapper[4721]: I0202 13:13:56.058861 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7\" (UID: \"53fd5f54-e7fe-4d86-a5b7-3583e945fff3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7" Feb 02 13:13:56 crc kubenswrapper[4721]: I0202 13:13:56.074903 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzlks\" (UniqueName: \"kubernetes.io/projected/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-kube-api-access-hzlks\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7\" (UID: \"53fd5f54-e7fe-4d86-a5b7-3583e945fff3\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7" Feb 02 13:13:56 crc kubenswrapper[4721]: I0202 13:13:56.094808 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7" Feb 02 13:13:56 crc kubenswrapper[4721]: I0202 13:13:56.163174 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n"] Feb 02 13:13:56 crc kubenswrapper[4721]: I0202 13:13:56.393175 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n" event={"ID":"4b23cf05-2074-4e51-b6ef-235b207d8b16","Type":"ContainerStarted","Data":"70890d5b395f40fb527c5817ac958b85be98e7537656d46714bf54f470595c89"} Feb 02 13:13:56 crc kubenswrapper[4721]: I0202 13:13:56.398806 4721 generic.go:334] "Generic (PLEG): container finished" podID="e4bc9245-cc92-4fa8-a195-74b2c1fa3018" containerID="56eb0bf573c34b6796229e31bbdcd14a34f6bd281bbf7f087dc4989701986dbe" exitCode=0 Feb 02 13:13:56 crc kubenswrapper[4721]: I0202 13:13:56.398851 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngvnj" event={"ID":"e4bc9245-cc92-4fa8-a195-74b2c1fa3018","Type":"ContainerDied","Data":"56eb0bf573c34b6796229e31bbdcd14a34f6bd281bbf7f087dc4989701986dbe"} Feb 02 13:13:56 crc kubenswrapper[4721]: I0202 13:13:56.562832 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7"] Feb 02 13:13:56 crc kubenswrapper[4721]: W0202 13:13:56.570144 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53fd5f54_e7fe_4d86_a5b7_3583e945fff3.slice/crio-74c7d987997096b66b94896c6c743a531ed69cc553ca19de86a558110a827f21 WatchSource:0}: Error finding container 74c7d987997096b66b94896c6c743a531ed69cc553ca19de86a558110a827f21: Status 404 returned error can't find the container with id 74c7d987997096b66b94896c6c743a531ed69cc553ca19de86a558110a827f21 Feb 02 13:13:57 crc kubenswrapper[4721]: I0202 13:13:57.406526 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngvnj" event={"ID":"e4bc9245-cc92-4fa8-a195-74b2c1fa3018","Type":"ContainerStarted","Data":"8d592e0cd2e9861f1bf77ee911de6dc93a57e75ad1d2d6069f5f8a4d50abb822"} Feb 02 13:13:57 crc kubenswrapper[4721]: I0202 13:13:57.408025 4721 generic.go:334] "Generic (PLEG): container finished" podID="53fd5f54-e7fe-4d86-a5b7-3583e945fff3" containerID="312ecdfe1b8de131ebb4e994ad7253a242238c9f1bc7b82397680434a40896a4" exitCode=0 Feb 02 13:13:57 crc kubenswrapper[4721]: I0202 13:13:57.408126 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7" event={"ID":"53fd5f54-e7fe-4d86-a5b7-3583e945fff3","Type":"ContainerDied","Data":"312ecdfe1b8de131ebb4e994ad7253a242238c9f1bc7b82397680434a40896a4"} Feb 02 13:13:57 crc kubenswrapper[4721]: I0202 13:13:57.408204 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7" event={"ID":"53fd5f54-e7fe-4d86-a5b7-3583e945fff3","Type":"ContainerStarted","Data":"74c7d987997096b66b94896c6c743a531ed69cc553ca19de86a558110a827f21"} Feb 02 13:13:57 crc kubenswrapper[4721]: I0202 13:13:57.409525 4721 generic.go:334] "Generic (PLEG): container finished" podID="4b23cf05-2074-4e51-b6ef-235b207d8b16" 
containerID="9e659b420eb15f9adab7ad44219ca5d9df8a5bf8c29f5b8ff96c57b42e14f33d" exitCode=0 Feb 02 13:13:57 crc kubenswrapper[4721]: I0202 13:13:57.409563 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n" event={"ID":"4b23cf05-2074-4e51-b6ef-235b207d8b16","Type":"ContainerDied","Data":"9e659b420eb15f9adab7ad44219ca5d9df8a5bf8c29f5b8ff96c57b42e14f33d"} Feb 02 13:13:57 crc kubenswrapper[4721]: I0202 13:13:57.426575 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ngvnj" podStartSLOduration=2.013986912 podStartE2EDuration="4.426555793s" podCreationTimestamp="2026-02-02 13:13:53 +0000 UTC" firstStartedPulling="2026-02-02 13:13:54.37237461 +0000 UTC m=+774.674888999" lastFinishedPulling="2026-02-02 13:13:56.784943481 +0000 UTC m=+777.087457880" observedRunningTime="2026-02-02 13:13:57.423189825 +0000 UTC m=+777.725704224" watchObservedRunningTime="2026-02-02 13:13:57.426555793 +0000 UTC m=+777.729070182" Feb 02 13:13:59 crc kubenswrapper[4721]: I0202 13:13:59.433942 4721 generic.go:334] "Generic (PLEG): container finished" podID="53fd5f54-e7fe-4d86-a5b7-3583e945fff3" containerID="5a6f2159311eaef61e3383c7faa5c40f41ba2c020d3b11829efc19fcfbe6797f" exitCode=0 Feb 02 13:13:59 crc kubenswrapper[4721]: I0202 13:13:59.434031 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7" event={"ID":"53fd5f54-e7fe-4d86-a5b7-3583e945fff3","Type":"ContainerDied","Data":"5a6f2159311eaef61e3383c7faa5c40f41ba2c020d3b11829efc19fcfbe6797f"} Feb 02 13:13:59 crc kubenswrapper[4721]: I0202 13:13:59.437486 4721 generic.go:334] "Generic (PLEG): container finished" podID="4b23cf05-2074-4e51-b6ef-235b207d8b16" containerID="1c339f583f4558704296e38b771f88fc05eec64bae8e0485efe01b990760ac0a" exitCode=0 Feb 02 13:13:59 crc kubenswrapper[4721]: I0202 13:13:59.437575 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n" event={"ID":"4b23cf05-2074-4e51-b6ef-235b207d8b16","Type":"ContainerDied","Data":"1c339f583f4558704296e38b771f88fc05eec64bae8e0485efe01b990760ac0a"} Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.316542 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jwkvv"] Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.317987 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.332945 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jwkvv"] Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.427671 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hshh8\" (UniqueName: \"kubernetes.io/projected/0240e395-0a12-40c4-b5e6-b31168b303ab-kube-api-access-hshh8\") pod \"redhat-operators-jwkvv\" (UID: \"0240e395-0a12-40c4-b5e6-b31168b303ab\") " pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.428002 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0240e395-0a12-40c4-b5e6-b31168b303ab-utilities\") pod \"redhat-operators-jwkvv\" (UID: \"0240e395-0a12-40c4-b5e6-b31168b303ab\") " pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.428353 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0240e395-0a12-40c4-b5e6-b31168b303ab-catalog-content\") pod \"redhat-operators-jwkvv\" (UID: \"0240e395-0a12-40c4-b5e6-b31168b303ab\") " pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.447485 4721 generic.go:334] "Generic (PLEG): container finished" podID="4b23cf05-2074-4e51-b6ef-235b207d8b16" containerID="aa72e2308b19fb71156a3c15943f1bc96bc8016ab1843ff013ca64b8faceff12" exitCode=0 Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.447572 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n" event={"ID":"4b23cf05-2074-4e51-b6ef-235b207d8b16","Type":"ContainerDied","Data":"aa72e2308b19fb71156a3c15943f1bc96bc8016ab1843ff013ca64b8faceff12"} Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.452603 4721 generic.go:334] "Generic (PLEG): container finished" podID="53fd5f54-e7fe-4d86-a5b7-3583e945fff3" containerID="2a275a934a34e5cc5ab2dd6a550b7015bc3b971f0a41965316c7b176c3891331" exitCode=0 Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.452641 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7" event={"ID":"53fd5f54-e7fe-4d86-a5b7-3583e945fff3","Type":"ContainerDied","Data":"2a275a934a34e5cc5ab2dd6a550b7015bc3b971f0a41965316c7b176c3891331"} Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.530146 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0240e395-0a12-40c4-b5e6-b31168b303ab-catalog-content\") pod \"redhat-operators-jwkvv\" (UID: \"0240e395-0a12-40c4-b5e6-b31168b303ab\") " pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.530525 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hshh8\" (UniqueName: \"kubernetes.io/projected/0240e395-0a12-40c4-b5e6-b31168b303ab-kube-api-access-hshh8\") pod \"redhat-operators-jwkvv\" (UID: \"0240e395-0a12-40c4-b5e6-b31168b303ab\") " pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 
13:14:00.530670 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0240e395-0a12-40c4-b5e6-b31168b303ab-utilities\") pod \"redhat-operators-jwkvv\" (UID: \"0240e395-0a12-40c4-b5e6-b31168b303ab\") " pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.530789 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0240e395-0a12-40c4-b5e6-b31168b303ab-catalog-content\") pod \"redhat-operators-jwkvv\" (UID: \"0240e395-0a12-40c4-b5e6-b31168b303ab\") " pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.531129 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0240e395-0a12-40c4-b5e6-b31168b303ab-utilities\") pod \"redhat-operators-jwkvv\" (UID: \"0240e395-0a12-40c4-b5e6-b31168b303ab\") " pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.551105 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hshh8\" (UniqueName: \"kubernetes.io/projected/0240e395-0a12-40c4-b5e6-b31168b303ab-kube-api-access-hshh8\") pod \"redhat-operators-jwkvv\" (UID: \"0240e395-0a12-40c4-b5e6-b31168b303ab\") " pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.634085 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:00 crc kubenswrapper[4721]: I0202 13:14:00.883729 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jwkvv"] Feb 02 13:14:00 crc kubenswrapper[4721]: W0202 13:14:00.889002 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0240e395_0a12_40c4_b5e6_b31168b303ab.slice/crio-b9b8aa71e9272c3a83f81ba3e2d7a8de409acbf418a29cb8e7063ac3659e42e9 WatchSource:0}: Error finding container b9b8aa71e9272c3a83f81ba3e2d7a8de409acbf418a29cb8e7063ac3659e42e9: Status 404 returned error can't find the container with id b9b8aa71e9272c3a83f81ba3e2d7a8de409acbf418a29cb8e7063ac3659e42e9 Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.459549 4721 generic.go:334] "Generic (PLEG): container finished" podID="0240e395-0a12-40c4-b5e6-b31168b303ab" containerID="3fdfe29f698f72ad98e08f0fea56f5e1ea04671f512653d91c811ab010ecba19" exitCode=0 Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.459609 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwkvv" event={"ID":"0240e395-0a12-40c4-b5e6-b31168b303ab","Type":"ContainerDied","Data":"3fdfe29f698f72ad98e08f0fea56f5e1ea04671f512653d91c811ab010ecba19"} Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.459649 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwkvv" event={"ID":"0240e395-0a12-40c4-b5e6-b31168b303ab","Type":"ContainerStarted","Data":"b9b8aa71e9272c3a83f81ba3e2d7a8de409acbf418a29cb8e7063ac3659e42e9"} Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.813246 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n" Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.815308 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7" Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.846803 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfcfv\" (UniqueName: \"kubernetes.io/projected/4b23cf05-2074-4e51-b6ef-235b207d8b16-kube-api-access-dfcfv\") pod \"4b23cf05-2074-4e51-b6ef-235b207d8b16\" (UID: \"4b23cf05-2074-4e51-b6ef-235b207d8b16\") " Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.846840 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-util\") pod \"53fd5f54-e7fe-4d86-a5b7-3583e945fff3\" (UID: \"53fd5f54-e7fe-4d86-a5b7-3583e945fff3\") " Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.846940 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b23cf05-2074-4e51-b6ef-235b207d8b16-bundle\") pod \"4b23cf05-2074-4e51-b6ef-235b207d8b16\" (UID: \"4b23cf05-2074-4e51-b6ef-235b207d8b16\") " Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.846959 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzlks\" (UniqueName: \"kubernetes.io/projected/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-kube-api-access-hzlks\") pod \"53fd5f54-e7fe-4d86-a5b7-3583e945fff3\" (UID: \"53fd5f54-e7fe-4d86-a5b7-3583e945fff3\") " Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.847023 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b23cf05-2074-4e51-b6ef-235b207d8b16-util\") pod \"4b23cf05-2074-4e51-b6ef-235b207d8b16\" (UID: \"4b23cf05-2074-4e51-b6ef-235b207d8b16\") " Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.847040 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-bundle\") pod \"53fd5f54-e7fe-4d86-a5b7-3583e945fff3\" (UID: \"53fd5f54-e7fe-4d86-a5b7-3583e945fff3\") " Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.848033 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b23cf05-2074-4e51-b6ef-235b207d8b16-bundle" (OuterVolumeSpecName: "bundle") pod "4b23cf05-2074-4e51-b6ef-235b207d8b16" (UID: "4b23cf05-2074-4e51-b6ef-235b207d8b16"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.848383 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-bundle" (OuterVolumeSpecName: "bundle") pod "53fd5f54-e7fe-4d86-a5b7-3583e945fff3" (UID: "53fd5f54-e7fe-4d86-a5b7-3583e945fff3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.854398 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b23cf05-2074-4e51-b6ef-235b207d8b16-kube-api-access-dfcfv" (OuterVolumeSpecName: "kube-api-access-dfcfv") pod "4b23cf05-2074-4e51-b6ef-235b207d8b16" (UID: "4b23cf05-2074-4e51-b6ef-235b207d8b16"). InnerVolumeSpecName "kube-api-access-dfcfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.854424 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-kube-api-access-hzlks" (OuterVolumeSpecName: "kube-api-access-hzlks") pod "53fd5f54-e7fe-4d86-a5b7-3583e945fff3" (UID: "53fd5f54-e7fe-4d86-a5b7-3583e945fff3"). InnerVolumeSpecName "kube-api-access-hzlks". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.866994 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-util" (OuterVolumeSpecName: "util") pod "53fd5f54-e7fe-4d86-a5b7-3583e945fff3" (UID: "53fd5f54-e7fe-4d86-a5b7-3583e945fff3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.887884 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b23cf05-2074-4e51-b6ef-235b207d8b16-util" (OuterVolumeSpecName: "util") pod "4b23cf05-2074-4e51-b6ef-235b207d8b16" (UID: "4b23cf05-2074-4e51-b6ef-235b207d8b16"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.948746 4721 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b23cf05-2074-4e51-b6ef-235b207d8b16-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.948785 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzlks\" (UniqueName: \"kubernetes.io/projected/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-kube-api-access-hzlks\") on node \"crc\" DevicePath \"\"" Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.948800 4721 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b23cf05-2074-4e51-b6ef-235b207d8b16-util\") on node \"crc\" DevicePath \"\"" Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.948812 4721 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.948826 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfcfv\" (UniqueName: \"kubernetes.io/projected/4b23cf05-2074-4e51-b6ef-235b207d8b16-kube-api-access-dfcfv\") on node \"crc\" DevicePath \"\"" Feb 02 13:14:01 crc kubenswrapper[4721]: I0202 13:14:01.948839 4721 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53fd5f54-e7fe-4d86-a5b7-3583e945fff3-util\") on node \"crc\" DevicePath \"\"" Feb 02 13:14:02 crc kubenswrapper[4721]: I0202 13:14:02.467660 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7" event={"ID":"53fd5f54-e7fe-4d86-a5b7-3583e945fff3","Type":"ContainerDied","Data":"74c7d987997096b66b94896c6c743a531ed69cc553ca19de86a558110a827f21"} Feb 02 13:14:02 crc kubenswrapper[4721]: I0202 13:14:02.467699 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74c7d987997096b66b94896c6c743a531ed69cc553ca19de86a558110a827f21" Feb 02 13:14:02 crc kubenswrapper[4721]: I0202 13:14:02.467678 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7" Feb 02 13:14:02 crc kubenswrapper[4721]: I0202 13:14:02.471348 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n" event={"ID":"4b23cf05-2074-4e51-b6ef-235b207d8b16","Type":"ContainerDied","Data":"70890d5b395f40fb527c5817ac958b85be98e7537656d46714bf54f470595c89"} Feb 02 13:14:02 crc kubenswrapper[4721]: I0202 13:14:02.471382 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70890d5b395f40fb527c5817ac958b85be98e7537656d46714bf54f470595c89" Feb 02 13:14:02 crc kubenswrapper[4721]: I0202 13:14:02.471387 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n" Feb 02 13:14:02 crc kubenswrapper[4721]: I0202 13:14:02.475220 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwkvv" event={"ID":"0240e395-0a12-40c4-b5e6-b31168b303ab","Type":"ContainerStarted","Data":"67cb9052470ca9d2b58d9c98a6bbe0181b292c87088d890cf20bcc66f8f8c5b1"} Feb 02 13:14:03 crc kubenswrapper[4721]: I0202 13:14:03.483219 4721 generic.go:334] "Generic (PLEG): container finished" podID="0240e395-0a12-40c4-b5e6-b31168b303ab" containerID="67cb9052470ca9d2b58d9c98a6bbe0181b292c87088d890cf20bcc66f8f8c5b1" exitCode=0 Feb 02 13:14:03 crc kubenswrapper[4721]: I0202 13:14:03.483463 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwkvv" event={"ID":"0240e395-0a12-40c4-b5e6-b31168b303ab","Type":"ContainerDied","Data":"67cb9052470ca9d2b58d9c98a6bbe0181b292c87088d890cf20bcc66f8f8c5b1"} Feb 02 13:14:03 crc kubenswrapper[4721]: I0202 13:14:03.694420 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ngvnj" Feb 02 13:14:03 crc kubenswrapper[4721]: I0202 13:14:03.694464 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ngvnj" Feb 02 13:14:03 crc kubenswrapper[4721]: I0202 13:14:03.736512 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ngvnj" Feb 02 13:14:04 crc kubenswrapper[4721]: I0202 13:14:04.492748 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwkvv" event={"ID":"0240e395-0a12-40c4-b5e6-b31168b303ab","Type":"ContainerStarted","Data":"d553ccad788dc78c0a62e5fdc6b75b154047af412e476df76d3a46cdee3d9f1e"} Feb 02 13:14:04 crc kubenswrapper[4721]: I0202 13:14:04.515446 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jwkvv" podStartSLOduration=2.023563493 podStartE2EDuration="4.515425717s" 
podCreationTimestamp="2026-02-02 13:14:00 +0000 UTC" firstStartedPulling="2026-02-02 13:14:01.461340756 +0000 UTC m=+781.763855145" lastFinishedPulling="2026-02-02 13:14:03.95320296 +0000 UTC m=+784.255717369" observedRunningTime="2026-02-02 13:14:04.511476144 +0000 UTC m=+784.813990533" watchObservedRunningTime="2026-02-02 13:14:04.515425717 +0000 UTC m=+784.817940106" Feb 02 13:14:04 crc kubenswrapper[4721]: I0202 13:14:04.532425 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ngvnj" Feb 02 13:14:07 crc kubenswrapper[4721]: I0202 13:14:07.711779 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ngvnj"] Feb 02 13:14:07 crc kubenswrapper[4721]: I0202 13:14:07.711996 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ngvnj" podUID="e4bc9245-cc92-4fa8-a195-74b2c1fa3018" containerName="registry-server" containerID="cri-o://8d592e0cd2e9861f1bf77ee911de6dc93a57e75ad1d2d6069f5f8a4d50abb822" gracePeriod=2 Feb 02 13:14:08 crc kubenswrapper[4721]: E0202 13:14:08.796804 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4bc9245_cc92_4fa8_a195_74b2c1fa3018.slice/crio-conmon-8d592e0cd2e9861f1bf77ee911de6dc93a57e75ad1d2d6069f5f8a4d50abb822.scope\": RecentStats: unable to find data in memory cache]" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.286639 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ngvnj" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.365601 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-utilities\") pod \"e4bc9245-cc92-4fa8-a195-74b2c1fa3018\" (UID: \"e4bc9245-cc92-4fa8-a195-74b2c1fa3018\") " Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.365719 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-catalog-content\") pod \"e4bc9245-cc92-4fa8-a195-74b2c1fa3018\" (UID: \"e4bc9245-cc92-4fa8-a195-74b2c1fa3018\") " Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.365813 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsw8s\" (UniqueName: \"kubernetes.io/projected/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-kube-api-access-fsw8s\") pod \"e4bc9245-cc92-4fa8-a195-74b2c1fa3018\" (UID: \"e4bc9245-cc92-4fa8-a195-74b2c1fa3018\") " Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.366625 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-utilities" (OuterVolumeSpecName: "utilities") pod "e4bc9245-cc92-4fa8-a195-74b2c1fa3018" (UID: "e4bc9245-cc92-4fa8-a195-74b2c1fa3018"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.371263 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-kube-api-access-fsw8s" (OuterVolumeSpecName: "kube-api-access-fsw8s") pod "e4bc9245-cc92-4fa8-a195-74b2c1fa3018" (UID: "e4bc9245-cc92-4fa8-a195-74b2c1fa3018"). InnerVolumeSpecName "kube-api-access-fsw8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.421482 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4bc9245-cc92-4fa8-a195-74b2c1fa3018" (UID: "e4bc9245-cc92-4fa8-a195-74b2c1fa3018"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.467142 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsw8s\" (UniqueName: \"kubernetes.io/projected/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-kube-api-access-fsw8s\") on node \"crc\" DevicePath \"\"" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.467185 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.467196 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bc9245-cc92-4fa8-a195-74b2c1fa3018-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.521187 4721 generic.go:334] "Generic (PLEG): container finished" podID="e4bc9245-cc92-4fa8-a195-74b2c1fa3018" containerID="8d592e0cd2e9861f1bf77ee911de6dc93a57e75ad1d2d6069f5f8a4d50abb822" exitCode=0 Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.521226 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngvnj" event={"ID":"e4bc9245-cc92-4fa8-a195-74b2c1fa3018","Type":"ContainerDied","Data":"8d592e0cd2e9861f1bf77ee911de6dc93a57e75ad1d2d6069f5f8a4d50abb822"} Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.521260 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ngvnj" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.521273 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ngvnj" event={"ID":"e4bc9245-cc92-4fa8-a195-74b2c1fa3018","Type":"ContainerDied","Data":"8c61895cb6dd6e4aec63d8237d91407d9d43fbbfa704efe9db664fadf6487376"} Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.521290 4721 scope.go:117] "RemoveContainer" containerID="8d592e0cd2e9861f1bf77ee911de6dc93a57e75ad1d2d6069f5f8a4d50abb822" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.542296 4721 scope.go:117] "RemoveContainer" containerID="56eb0bf573c34b6796229e31bbdcd14a34f6bd281bbf7f087dc4989701986dbe" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.557442 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ngvnj"] Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.563355 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ngvnj"] Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.578689 4721 scope.go:117] "RemoveContainer" containerID="992bd2a820ea0937516ce359186e2efc41e06c8257ba292dc7b7f5246e9d846d" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.595795 4721 scope.go:117] "RemoveContainer" containerID="8d592e0cd2e9861f1bf77ee911de6dc93a57e75ad1d2d6069f5f8a4d50abb822" Feb 02 13:14:09 crc kubenswrapper[4721]: E0202 13:14:09.596226 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d592e0cd2e9861f1bf77ee911de6dc93a57e75ad1d2d6069f5f8a4d50abb822\": container with ID starting with 8d592e0cd2e9861f1bf77ee911de6dc93a57e75ad1d2d6069f5f8a4d50abb822 not found: ID does not exist" containerID="8d592e0cd2e9861f1bf77ee911de6dc93a57e75ad1d2d6069f5f8a4d50abb822" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.596274 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d592e0cd2e9861f1bf77ee911de6dc93a57e75ad1d2d6069f5f8a4d50abb822"} err="failed to get container status \"8d592e0cd2e9861f1bf77ee911de6dc93a57e75ad1d2d6069f5f8a4d50abb822\": rpc error: code = NotFound desc = could not find container \"8d592e0cd2e9861f1bf77ee911de6dc93a57e75ad1d2d6069f5f8a4d50abb822\": container with ID starting with 8d592e0cd2e9861f1bf77ee911de6dc93a57e75ad1d2d6069f5f8a4d50abb822 not found: ID does not exist" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.596303 4721 scope.go:117] "RemoveContainer" containerID="56eb0bf573c34b6796229e31bbdcd14a34f6bd281bbf7f087dc4989701986dbe" Feb 02 13:14:09 crc kubenswrapper[4721]: E0202 13:14:09.596864 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56eb0bf573c34b6796229e31bbdcd14a34f6bd281bbf7f087dc4989701986dbe\": container with ID starting with 56eb0bf573c34b6796229e31bbdcd14a34f6bd281bbf7f087dc4989701986dbe not found: ID does not exist" containerID="56eb0bf573c34b6796229e31bbdcd14a34f6bd281bbf7f087dc4989701986dbe" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.596980 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56eb0bf573c34b6796229e31bbdcd14a34f6bd281bbf7f087dc4989701986dbe"} err="failed to get container status \"56eb0bf573c34b6796229e31bbdcd14a34f6bd281bbf7f087dc4989701986dbe\": rpc error: code = NotFound desc = could not find 
container \"56eb0bf573c34b6796229e31bbdcd14a34f6bd281bbf7f087dc4989701986dbe\": container with ID starting with 56eb0bf573c34b6796229e31bbdcd14a34f6bd281bbf7f087dc4989701986dbe not found: ID does not exist" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.597095 4721 scope.go:117] "RemoveContainer" containerID="992bd2a820ea0937516ce359186e2efc41e06c8257ba292dc7b7f5246e9d846d" Feb 02 13:14:09 crc kubenswrapper[4721]: E0202 13:14:09.598310 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"992bd2a820ea0937516ce359186e2efc41e06c8257ba292dc7b7f5246e9d846d\": container with ID starting with 992bd2a820ea0937516ce359186e2efc41e06c8257ba292dc7b7f5246e9d846d not found: ID does not exist" containerID="992bd2a820ea0937516ce359186e2efc41e06c8257ba292dc7b7f5246e9d846d" Feb 02 13:14:09 crc kubenswrapper[4721]: I0202 13:14:09.598347 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"992bd2a820ea0937516ce359186e2efc41e06c8257ba292dc7b7f5246e9d846d"} err="failed to get container status \"992bd2a820ea0937516ce359186e2efc41e06c8257ba292dc7b7f5246e9d846d\": rpc error: code = NotFound desc = could not find container \"992bd2a820ea0937516ce359186e2efc41e06c8257ba292dc7b7f5246e9d846d\": container with ID starting with 992bd2a820ea0937516ce359186e2efc41e06c8257ba292dc7b7f5246e9d846d not found: ID does not exist" Feb 02 13:14:10 crc kubenswrapper[4721]: I0202 13:14:10.417879 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4bc9245-cc92-4fa8-a195-74b2c1fa3018" path="/var/lib/kubelet/pods/e4bc9245-cc92-4fa8-a195-74b2c1fa3018/volumes" Feb 02 13:14:10 crc kubenswrapper[4721]: I0202 13:14:10.634346 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:10 crc kubenswrapper[4721]: I0202 13:14:10.634717 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:10 crc kubenswrapper[4721]: I0202 13:14:10.677996 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:11 crc kubenswrapper[4721]: I0202 13:14:11.582195 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.137991 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5"] Feb 02 13:14:12 crc kubenswrapper[4721]: E0202 13:14:12.138298 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b23cf05-2074-4e51-b6ef-235b207d8b16" containerName="pull" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.138320 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b23cf05-2074-4e51-b6ef-235b207d8b16" containerName="pull" Feb 02 13:14:12 crc kubenswrapper[4721]: E0202 13:14:12.138338 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4bc9245-cc92-4fa8-a195-74b2c1fa3018" containerName="extract-content" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.138347 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4bc9245-cc92-4fa8-a195-74b2c1fa3018" containerName="extract-content" Feb 02 13:14:12 crc kubenswrapper[4721]: E0202 13:14:12.138356 4721 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4b23cf05-2074-4e51-b6ef-235b207d8b16" containerName="util" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.138365 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b23cf05-2074-4e51-b6ef-235b207d8b16" containerName="util" Feb 02 13:14:12 crc kubenswrapper[4721]: E0202 13:14:12.138381 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4bc9245-cc92-4fa8-a195-74b2c1fa3018" containerName="registry-server" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.138388 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4bc9245-cc92-4fa8-a195-74b2c1fa3018" containerName="registry-server" Feb 02 13:14:12 crc kubenswrapper[4721]: E0202 13:14:12.138402 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4bc9245-cc92-4fa8-a195-74b2c1fa3018" containerName="extract-utilities" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.138410 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4bc9245-cc92-4fa8-a195-74b2c1fa3018" containerName="extract-utilities" Feb 02 13:14:12 crc kubenswrapper[4721]: E0202 13:14:12.138420 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53fd5f54-e7fe-4d86-a5b7-3583e945fff3" containerName="pull" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.138427 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fd5f54-e7fe-4d86-a5b7-3583e945fff3" containerName="pull" Feb 02 13:14:12 crc kubenswrapper[4721]: E0202 13:14:12.138437 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b23cf05-2074-4e51-b6ef-235b207d8b16" containerName="extract" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.138444 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b23cf05-2074-4e51-b6ef-235b207d8b16" containerName="extract" Feb 02 13:14:12 crc kubenswrapper[4721]: E0202 13:14:12.138459 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53fd5f54-e7fe-4d86-a5b7-3583e945fff3" containerName="extract" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.138467 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fd5f54-e7fe-4d86-a5b7-3583e945fff3" containerName="extract" Feb 02 13:14:12 crc kubenswrapper[4721]: E0202 13:14:12.138482 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53fd5f54-e7fe-4d86-a5b7-3583e945fff3" containerName="util" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.138489 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fd5f54-e7fe-4d86-a5b7-3583e945fff3" containerName="util" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.138631 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4bc9245-cc92-4fa8-a195-74b2c1fa3018" containerName="registry-server" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.138645 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="53fd5f54-e7fe-4d86-a5b7-3583e945fff3" containerName="extract" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.138657 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b23cf05-2074-4e51-b6ef-235b207d8b16" containerName="extract" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.139334 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.141403 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-nqmwn" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.141670 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.142115 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.142181 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.142213 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.148162 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.156261 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5"] Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.202588 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4mq8\" (UniqueName: \"kubernetes.io/projected/af1123f2-fce6-410b-a82b-9b292bb8bf68-kube-api-access-c4mq8\") pod \"loki-operator-controller-manager-756566789b-zpsf5\" (UID: \"af1123f2-fce6-410b-a82b-9b292bb8bf68\") " pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.202669 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af1123f2-fce6-410b-a82b-9b292bb8bf68-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-756566789b-zpsf5\" (UID: \"af1123f2-fce6-410b-a82b-9b292bb8bf68\") " pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.202700 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/af1123f2-fce6-410b-a82b-9b292bb8bf68-manager-config\") pod \"loki-operator-controller-manager-756566789b-zpsf5\" (UID: \"af1123f2-fce6-410b-a82b-9b292bb8bf68\") " pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.202924 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/af1123f2-fce6-410b-a82b-9b292bb8bf68-apiservice-cert\") pod \"loki-operator-controller-manager-756566789b-zpsf5\" (UID: \"af1123f2-fce6-410b-a82b-9b292bb8bf68\") " pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.203052 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/af1123f2-fce6-410b-a82b-9b292bb8bf68-webhook-cert\") pod \"loki-operator-controller-manager-756566789b-zpsf5\" (UID: \"af1123f2-fce6-410b-a82b-9b292bb8bf68\") " pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.304350 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/af1123f2-fce6-410b-a82b-9b292bb8bf68-apiservice-cert\") pod \"loki-operator-controller-manager-756566789b-zpsf5\" (UID: \"af1123f2-fce6-410b-a82b-9b292bb8bf68\") " pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.304410 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af1123f2-fce6-410b-a82b-9b292bb8bf68-webhook-cert\") pod \"loki-operator-controller-manager-756566789b-zpsf5\" (UID: \"af1123f2-fce6-410b-a82b-9b292bb8bf68\") " pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.304435 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4mq8\" (UniqueName: \"kubernetes.io/projected/af1123f2-fce6-410b-a82b-9b292bb8bf68-kube-api-access-c4mq8\") pod \"loki-operator-controller-manager-756566789b-zpsf5\" (UID: \"af1123f2-fce6-410b-a82b-9b292bb8bf68\") " pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.304472 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af1123f2-fce6-410b-a82b-9b292bb8bf68-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-756566789b-zpsf5\" (UID: \"af1123f2-fce6-410b-a82b-9b292bb8bf68\") " pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.304497 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/af1123f2-fce6-410b-a82b-9b292bb8bf68-manager-config\") pod \"loki-operator-controller-manager-756566789b-zpsf5\" (UID: \"af1123f2-fce6-410b-a82b-9b292bb8bf68\") " pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.305540 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/af1123f2-fce6-410b-a82b-9b292bb8bf68-manager-config\") pod \"loki-operator-controller-manager-756566789b-zpsf5\" (UID: \"af1123f2-fce6-410b-a82b-9b292bb8bf68\") " pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.317014 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/af1123f2-fce6-410b-a82b-9b292bb8bf68-apiservice-cert\") pod \"loki-operator-controller-manager-756566789b-zpsf5\" (UID: \"af1123f2-fce6-410b-a82b-9b292bb8bf68\") " pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.329111 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/af1123f2-fce6-410b-a82b-9b292bb8bf68-webhook-cert\") pod \"loki-operator-controller-manager-756566789b-zpsf5\" (UID: \"af1123f2-fce6-410b-a82b-9b292bb8bf68\") " pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.331828 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af1123f2-fce6-410b-a82b-9b292bb8bf68-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-756566789b-zpsf5\" (UID: \"af1123f2-fce6-410b-a82b-9b292bb8bf68\") " pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.337035 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4mq8\" (UniqueName: \"kubernetes.io/projected/af1123f2-fce6-410b-a82b-9b292bb8bf68-kube-api-access-c4mq8\") pod \"loki-operator-controller-manager-756566789b-zpsf5\" (UID: \"af1123f2-fce6-410b-a82b-9b292bb8bf68\") " pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.459976 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:12 crc kubenswrapper[4721]: I0202 13:14:12.768672 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5"] Feb 02 13:14:12 crc kubenswrapper[4721]: W0202 13:14:12.780229 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf1123f2_fce6_410b_a82b_9b292bb8bf68.slice/crio-59d16b5f698762c1653aab76a89de977017af837e0a4202956105b348b163a34 WatchSource:0}: Error finding container 59d16b5f698762c1653aab76a89de977017af837e0a4202956105b348b163a34: Status 404 returned error can't find the container with id 59d16b5f698762c1653aab76a89de977017af837e0a4202956105b348b163a34 Feb 02 13:14:13 crc kubenswrapper[4721]: I0202 13:14:13.550596 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" event={"ID":"af1123f2-fce6-410b-a82b-9b292bb8bf68","Type":"ContainerStarted","Data":"59d16b5f698762c1653aab76a89de977017af837e0a4202956105b348b163a34"} Feb 02 13:14:14 crc kubenswrapper[4721]: I0202 13:14:14.764269 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:14:14 crc kubenswrapper[4721]: I0202 13:14:14.764336 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:14:14 crc kubenswrapper[4721]: I0202 13:14:14.926267 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-9sfnv"] Feb 02 13:14:14 crc kubenswrapper[4721]: I0202 13:14:14.927184 4721 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-9sfnv" Feb 02 13:14:14 crc kubenswrapper[4721]: I0202 13:14:14.929613 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Feb 02 13:14:14 crc kubenswrapper[4721]: I0202 13:14:14.929674 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Feb 02 13:14:14 crc kubenswrapper[4721]: I0202 13:14:14.930087 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-rs8dq" Feb 02 13:14:14 crc kubenswrapper[4721]: I0202 13:14:14.944738 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-9sfnv"] Feb 02 13:14:14 crc kubenswrapper[4721]: I0202 13:14:14.947580 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5xxl\" (UniqueName: \"kubernetes.io/projected/f9c5d281-206d-4729-a031-feb5b9234c8f-kube-api-access-v5xxl\") pod \"cluster-logging-operator-79cf69ddc8-9sfnv\" (UID: \"f9c5d281-206d-4729-a031-feb5b9234c8f\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-9sfnv" Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.049480 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5xxl\" (UniqueName: \"kubernetes.io/projected/f9c5d281-206d-4729-a031-feb5b9234c8f-kube-api-access-v5xxl\") pod \"cluster-logging-operator-79cf69ddc8-9sfnv\" (UID: \"f9c5d281-206d-4729-a031-feb5b9234c8f\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-9sfnv" Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.073991 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5xxl\" (UniqueName: \"kubernetes.io/projected/f9c5d281-206d-4729-a031-feb5b9234c8f-kube-api-access-v5xxl\") pod \"cluster-logging-operator-79cf69ddc8-9sfnv\" (UID: \"f9c5d281-206d-4729-a031-feb5b9234c8f\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-9sfnv" Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.107690 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jwkvv"] Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.108807 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jwkvv" podUID="0240e395-0a12-40c4-b5e6-b31168b303ab" containerName="registry-server" containerID="cri-o://d553ccad788dc78c0a62e5fdc6b75b154047af412e476df76d3a46cdee3d9f1e" gracePeriod=2 Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.279419 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-9sfnv" Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.604680 4721 generic.go:334] "Generic (PLEG): container finished" podID="0240e395-0a12-40c4-b5e6-b31168b303ab" containerID="d553ccad788dc78c0a62e5fdc6b75b154047af412e476df76d3a46cdee3d9f1e" exitCode=0 Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.605159 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwkvv" event={"ID":"0240e395-0a12-40c4-b5e6-b31168b303ab","Type":"ContainerDied","Data":"d553ccad788dc78c0a62e5fdc6b75b154047af412e476df76d3a46cdee3d9f1e"} Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.643734 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.660104 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0240e395-0a12-40c4-b5e6-b31168b303ab-utilities\") pod \"0240e395-0a12-40c4-b5e6-b31168b303ab\" (UID: \"0240e395-0a12-40c4-b5e6-b31168b303ab\") " Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.660200 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0240e395-0a12-40c4-b5e6-b31168b303ab-catalog-content\") pod \"0240e395-0a12-40c4-b5e6-b31168b303ab\" (UID: \"0240e395-0a12-40c4-b5e6-b31168b303ab\") " Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.660240 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hshh8\" (UniqueName: \"kubernetes.io/projected/0240e395-0a12-40c4-b5e6-b31168b303ab-kube-api-access-hshh8\") pod \"0240e395-0a12-40c4-b5e6-b31168b303ab\" (UID: \"0240e395-0a12-40c4-b5e6-b31168b303ab\") " Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.661017 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0240e395-0a12-40c4-b5e6-b31168b303ab-utilities" (OuterVolumeSpecName: "utilities") pod "0240e395-0a12-40c4-b5e6-b31168b303ab" (UID: "0240e395-0a12-40c4-b5e6-b31168b303ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.670897 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0240e395-0a12-40c4-b5e6-b31168b303ab-kube-api-access-hshh8" (OuterVolumeSpecName: "kube-api-access-hshh8") pod "0240e395-0a12-40c4-b5e6-b31168b303ab" (UID: "0240e395-0a12-40c4-b5e6-b31168b303ab"). InnerVolumeSpecName "kube-api-access-hshh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.682899 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-9sfnv"] Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.761681 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0240e395-0a12-40c4-b5e6-b31168b303ab-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.761712 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hshh8\" (UniqueName: \"kubernetes.io/projected/0240e395-0a12-40c4-b5e6-b31168b303ab-kube-api-access-hshh8\") on node \"crc\" DevicePath \"\"" Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.844177 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0240e395-0a12-40c4-b5e6-b31168b303ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0240e395-0a12-40c4-b5e6-b31168b303ab" (UID: "0240e395-0a12-40c4-b5e6-b31168b303ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:14:15 crc kubenswrapper[4721]: I0202 13:14:15.864374 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0240e395-0a12-40c4-b5e6-b31168b303ab-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:14:16 crc kubenswrapper[4721]: I0202 13:14:16.613130 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-9sfnv" event={"ID":"f9c5d281-206d-4729-a031-feb5b9234c8f","Type":"ContainerStarted","Data":"f952ffe5054eaddc53bbf21a0ed35d42abd52a1957b93939bfbaa5c325764b2e"} Feb 02 13:14:16 crc kubenswrapper[4721]: I0202 13:14:16.615282 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jwkvv" event={"ID":"0240e395-0a12-40c4-b5e6-b31168b303ab","Type":"ContainerDied","Data":"b9b8aa71e9272c3a83f81ba3e2d7a8de409acbf418a29cb8e7063ac3659e42e9"} Feb 02 13:14:16 crc kubenswrapper[4721]: I0202 13:14:16.615339 4721 scope.go:117] "RemoveContainer" containerID="d553ccad788dc78c0a62e5fdc6b75b154047af412e476df76d3a46cdee3d9f1e" Feb 02 13:14:16 crc kubenswrapper[4721]: I0202 13:14:16.615439 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jwkvv" Feb 02 13:14:16 crc kubenswrapper[4721]: I0202 13:14:16.637725 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jwkvv"] Feb 02 13:14:16 crc kubenswrapper[4721]: I0202 13:14:16.642744 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jwkvv"] Feb 02 13:14:18 crc kubenswrapper[4721]: I0202 13:14:18.427875 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0240e395-0a12-40c4-b5e6-b31168b303ab" path="/var/lib/kubelet/pods/0240e395-0a12-40c4-b5e6-b31168b303ab/volumes" Feb 02 13:14:18 crc kubenswrapper[4721]: I0202 13:14:18.478600 4721 scope.go:117] "RemoveContainer" containerID="67cb9052470ca9d2b58d9c98a6bbe0181b292c87088d890cf20bcc66f8f8c5b1" Feb 02 13:14:18 crc kubenswrapper[4721]: I0202 13:14:18.503559 4721 scope.go:117] "RemoveContainer" containerID="3fdfe29f698f72ad98e08f0fea56f5e1ea04671f512653d91c811ab010ecba19" Feb 02 13:14:19 crc kubenswrapper[4721]: I0202 13:14:19.643121 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" event={"ID":"af1123f2-fce6-410b-a82b-9b292bb8bf68","Type":"ContainerStarted","Data":"ff92a3c5e4759b22495907be8527b3528c9739486111eb31ba8deee32f0fab14"} Feb 02 13:14:27 crc kubenswrapper[4721]: I0202 13:14:27.692643 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-9sfnv" event={"ID":"f9c5d281-206d-4729-a031-feb5b9234c8f","Type":"ContainerStarted","Data":"ba5f0130ce3b917509918bb04b21db20e35aa5b25888b40e07c39452af518a77"} Feb 02 13:14:27 crc kubenswrapper[4721]: I0202 13:14:27.698730 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" event={"ID":"af1123f2-fce6-410b-a82b-9b292bb8bf68","Type":"ContainerStarted","Data":"f13de78a119377f101d453a3ce4e9a78e1680646f62d40fcb81d8ed07dbb9210"} Feb 02 13:14:27 crc kubenswrapper[4721]: I0202 13:14:27.713034 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-9sfnv" podStartSLOduration=2.765682576 podStartE2EDuration="13.713015033s" podCreationTimestamp="2026-02-02 13:14:14 +0000 UTC" firstStartedPulling="2026-02-02 13:14:15.725611959 +0000 UTC m=+796.028126348" lastFinishedPulling="2026-02-02 13:14:26.672944426 +0000 UTC m=+806.975458805" observedRunningTime="2026-02-02 13:14:27.709827019 +0000 UTC m=+808.012341408" watchObservedRunningTime="2026-02-02 13:14:27.713015033 +0000 UTC m=+808.015529422" Feb 02 13:14:27 crc kubenswrapper[4721]: I0202 13:14:27.750632 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" podStartSLOduration=1.784592833 podStartE2EDuration="15.750608845s" podCreationTimestamp="2026-02-02 13:14:12 +0000 UTC" firstStartedPulling="2026-02-02 13:14:12.782325382 +0000 UTC m=+793.084839771" lastFinishedPulling="2026-02-02 13:14:26.748341394 +0000 UTC m=+807.050855783" observedRunningTime="2026-02-02 13:14:27.747544894 +0000 UTC m=+808.050059293" watchObservedRunningTime="2026-02-02 13:14:27.750608845 +0000 UTC m=+808.053123244" Feb 02 13:14:28 crc kubenswrapper[4721]: I0202 13:14:28.704472 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:28 crc kubenswrapper[4721]: I0202 13:14:28.706574 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-756566789b-zpsf5" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.185534 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2j8r5"] Feb 02 13:14:33 crc kubenswrapper[4721]: E0202 13:14:33.186379 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0240e395-0a12-40c4-b5e6-b31168b303ab" containerName="extract-content" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.186392 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="0240e395-0a12-40c4-b5e6-b31168b303ab" containerName="extract-content" Feb 02 13:14:33 crc kubenswrapper[4721]: E0202 13:14:33.186408 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0240e395-0a12-40c4-b5e6-b31168b303ab" containerName="registry-server" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.186416 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="0240e395-0a12-40c4-b5e6-b31168b303ab" containerName="registry-server" Feb 02 13:14:33 crc kubenswrapper[4721]: E0202 13:14:33.186424 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0240e395-0a12-40c4-b5e6-b31168b303ab" containerName="extract-utilities" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.186432 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="0240e395-0a12-40c4-b5e6-b31168b303ab" containerName="extract-utilities" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.186559 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="0240e395-0a12-40c4-b5e6-b31168b303ab" containerName="registry-server" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.187448 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.205049 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2j8r5"] Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.355201 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-catalog-content\") pod \"redhat-marketplace-2j8r5\" (UID: \"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5\") " pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.355295 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p27h\" (UniqueName: \"kubernetes.io/projected/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-kube-api-access-4p27h\") pod \"redhat-marketplace-2j8r5\" (UID: \"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5\") " pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.355316 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-utilities\") pod \"redhat-marketplace-2j8r5\" (UID: \"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5\") " pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.456949 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-catalog-content\") pod \"redhat-marketplace-2j8r5\" (UID: \"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5\") " pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.457338 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p27h\" (UniqueName: \"kubernetes.io/projected/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-kube-api-access-4p27h\") pod \"redhat-marketplace-2j8r5\" (UID: \"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5\") " pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.457361 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-utilities\") pod \"redhat-marketplace-2j8r5\" (UID: \"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5\") " pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.457613 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-catalog-content\") pod \"redhat-marketplace-2j8r5\" (UID: \"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5\") " pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.457968 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-utilities\") pod \"redhat-marketplace-2j8r5\" (UID: \"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5\") " pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.481278 4721 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4p27h\" (UniqueName: \"kubernetes.io/projected/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-kube-api-access-4p27h\") pod \"redhat-marketplace-2j8r5\" (UID: \"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5\") " pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.508837 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.730482 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2j8r5"] Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.733414 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2j8r5" event={"ID":"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5","Type":"ContainerStarted","Data":"c4a7acf51223f45f628cd91f41481e7aede65390c1e3de98544e03d9db406fc1"} Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.849678 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.850670 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.853688 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.856013 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.859949 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.964021 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a9223aca-8ddb-4dbe-870b-600d3969f728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a9223aca-8ddb-4dbe-870b-600d3969f728\") pod \"minio\" (UID: \"9648319e-f888-4996-976b-f17c6e130cde\") " pod="minio-dev/minio" Feb 02 13:14:33 crc kubenswrapper[4721]: I0202 13:14:33.964181 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnb6p\" (UniqueName: \"kubernetes.io/projected/9648319e-f888-4996-976b-f17c6e130cde-kube-api-access-nnb6p\") pod \"minio\" (UID: \"9648319e-f888-4996-976b-f17c6e130cde\") " pod="minio-dev/minio" Feb 02 13:14:34 crc kubenswrapper[4721]: I0202 13:14:34.065471 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnb6p\" (UniqueName: \"kubernetes.io/projected/9648319e-f888-4996-976b-f17c6e130cde-kube-api-access-nnb6p\") pod \"minio\" (UID: \"9648319e-f888-4996-976b-f17c6e130cde\") " pod="minio-dev/minio" Feb 02 13:14:34 crc kubenswrapper[4721]: I0202 13:14:34.065600 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a9223aca-8ddb-4dbe-870b-600d3969f728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a9223aca-8ddb-4dbe-870b-600d3969f728\") pod \"minio\" (UID: \"9648319e-f888-4996-976b-f17c6e130cde\") " pod="minio-dev/minio" Feb 02 13:14:34 crc kubenswrapper[4721]: I0202 13:14:34.068556 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 13:14:34 crc kubenswrapper[4721]: I0202 13:14:34.068602 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a9223aca-8ddb-4dbe-870b-600d3969f728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a9223aca-8ddb-4dbe-870b-600d3969f728\") pod \"minio\" (UID: \"9648319e-f888-4996-976b-f17c6e130cde\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f287cb52ed049d2ea91d091276fa08fb761d43203bac285e5deb1efbd9b1dabf/globalmount\"" pod="minio-dev/minio" Feb 02 13:14:34 crc kubenswrapper[4721]: I0202 13:14:34.092092 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a9223aca-8ddb-4dbe-870b-600d3969f728\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a9223aca-8ddb-4dbe-870b-600d3969f728\") pod \"minio\" (UID: \"9648319e-f888-4996-976b-f17c6e130cde\") " pod="minio-dev/minio" Feb 02 13:14:34 crc kubenswrapper[4721]: I0202 13:14:34.101556 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnb6p\" (UniqueName: \"kubernetes.io/projected/9648319e-f888-4996-976b-f17c6e130cde-kube-api-access-nnb6p\") pod \"minio\" (UID: \"9648319e-f888-4996-976b-f17c6e130cde\") " pod="minio-dev/minio" Feb 02 13:14:34 crc kubenswrapper[4721]: I0202 13:14:34.197312 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Feb 02 13:14:34 crc kubenswrapper[4721]: I0202 13:14:34.391521 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 02 13:14:34 crc kubenswrapper[4721]: I0202 13:14:34.752345 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"9648319e-f888-4996-976b-f17c6e130cde","Type":"ContainerStarted","Data":"debbb5bcb668fcd076d496652b6eed619f4a71ae606d31608913d3ffbb8fc23f"} Feb 02 13:14:34 crc kubenswrapper[4721]: I0202 13:14:34.754521 4721 generic.go:334] "Generic (PLEG): container finished" podID="2c09ae42-5e47-43fd-bcdb-b843f0a80cf5" containerID="257023ea900b686319d684f93bf86174efc5e0727f68b5296e819f6673afb9f7" exitCode=0 Feb 02 13:14:34 crc kubenswrapper[4721]: I0202 13:14:34.754561 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2j8r5" event={"ID":"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5","Type":"ContainerDied","Data":"257023ea900b686319d684f93bf86174efc5e0727f68b5296e819f6673afb9f7"} Feb 02 13:14:37 crc kubenswrapper[4721]: I0202 13:14:37.773349 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"9648319e-f888-4996-976b-f17c6e130cde","Type":"ContainerStarted","Data":"3565b34d86d02a3d4322f46c7a25c2ed69b11c2d2182abf9f8760caf8949512d"} Feb 02 13:14:37 crc kubenswrapper[4721]: I0202 13:14:37.775335 4721 generic.go:334] "Generic (PLEG): container finished" podID="2c09ae42-5e47-43fd-bcdb-b843f0a80cf5" containerID="e28c5c9633828095d9e468bec9d5dca15024d1f9adc02fe17ec54b56e17309d6" exitCode=0 Feb 02 13:14:37 crc kubenswrapper[4721]: I0202 13:14:37.775360 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2j8r5" event={"ID":"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5","Type":"ContainerDied","Data":"e28c5c9633828095d9e468bec9d5dca15024d1f9adc02fe17ec54b56e17309d6"} Feb 02 13:14:37 crc kubenswrapper[4721]: I0202 13:14:37.791738 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=3.995107217 podStartE2EDuration="6.791721204s" 
podCreationTimestamp="2026-02-02 13:14:31 +0000 UTC" firstStartedPulling="2026-02-02 13:14:34.397324941 +0000 UTC m=+814.699839330" lastFinishedPulling="2026-02-02 13:14:37.193938928 +0000 UTC m=+817.496453317" observedRunningTime="2026-02-02 13:14:37.786760023 +0000 UTC m=+818.089274402" watchObservedRunningTime="2026-02-02 13:14:37.791721204 +0000 UTC m=+818.094235593" Feb 02 13:14:38 crc kubenswrapper[4721]: I0202 13:14:38.784125 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2j8r5" event={"ID":"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5","Type":"ContainerStarted","Data":"489cc6a3cbe57f3f7a1a1c8c11ef02d4ab91eb4ecc18fe6c0a2d47b0318d8936"} Feb 02 13:14:38 crc kubenswrapper[4721]: I0202 13:14:38.806007 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2j8r5" podStartSLOduration=2.380270372 podStartE2EDuration="5.805987721s" podCreationTimestamp="2026-02-02 13:14:33 +0000 UTC" firstStartedPulling="2026-02-02 13:14:34.756703396 +0000 UTC m=+815.059217825" lastFinishedPulling="2026-02-02 13:14:38.182420785 +0000 UTC m=+818.484935174" observedRunningTime="2026-02-02 13:14:38.802457259 +0000 UTC m=+819.104971658" watchObservedRunningTime="2026-02-02 13:14:38.805987721 +0000 UTC m=+819.108502120" Feb 02 13:14:43 crc kubenswrapper[4721]: I0202 13:14:43.509720 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:43 crc kubenswrapper[4721]: I0202 13:14:43.510277 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:43 crc kubenswrapper[4721]: I0202 13:14:43.551786 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:43 crc kubenswrapper[4721]: I0202 13:14:43.862808 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:43 crc kubenswrapper[4721]: I0202 13:14:43.905609 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2j8r5"] Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.118655 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs"] Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.119944 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.123835 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.123919 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs"] Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.128604 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.128777 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-qw8n7" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.134148 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.134196 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.222026 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/7a392a7d-824d-420d-bf0d-66ca95134ea6-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-tb6gs\" (UID: \"7a392a7d-824d-420d-bf0d-66ca95134ea6\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.222096 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a392a7d-824d-420d-bf0d-66ca95134ea6-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-tb6gs\" (UID: \"7a392a7d-824d-420d-bf0d-66ca95134ea6\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.222121 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/7a392a7d-824d-420d-bf0d-66ca95134ea6-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-tb6gs\" (UID: \"7a392a7d-824d-420d-bf0d-66ca95134ea6\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.222158 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z75dm\" (UniqueName: \"kubernetes.io/projected/7a392a7d-824d-420d-bf0d-66ca95134ea6-kube-api-access-z75dm\") pod \"logging-loki-distributor-5f678c8dd6-tb6gs\" (UID: \"7a392a7d-824d-420d-bf0d-66ca95134ea6\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.222176 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a392a7d-824d-420d-bf0d-66ca95134ea6-config\") pod \"logging-loki-distributor-5f678c8dd6-tb6gs\" (UID: \"7a392a7d-824d-420d-bf0d-66ca95134ea6\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.281943 4721 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-logging/logging-loki-querier-76788598db-mnp7c"] Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.282937 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.286360 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.286383 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.286833 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.310417 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76788598db-mnp7c"] Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.323926 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/7a392a7d-824d-420d-bf0d-66ca95134ea6-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-tb6gs\" (UID: \"7a392a7d-824d-420d-bf0d-66ca95134ea6\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.323991 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a392a7d-824d-420d-bf0d-66ca95134ea6-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-tb6gs\" (UID: \"7a392a7d-824d-420d-bf0d-66ca95134ea6\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.324020 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/7a392a7d-824d-420d-bf0d-66ca95134ea6-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-tb6gs\" (UID: \"7a392a7d-824d-420d-bf0d-66ca95134ea6\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.324089 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a392a7d-824d-420d-bf0d-66ca95134ea6-config\") pod \"logging-loki-distributor-5f678c8dd6-tb6gs\" (UID: \"7a392a7d-824d-420d-bf0d-66ca95134ea6\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.324115 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z75dm\" (UniqueName: \"kubernetes.io/projected/7a392a7d-824d-420d-bf0d-66ca95134ea6-kube-api-access-z75dm\") pod \"logging-loki-distributor-5f678c8dd6-tb6gs\" (UID: \"7a392a7d-824d-420d-bf0d-66ca95134ea6\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.335279 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a392a7d-824d-420d-bf0d-66ca95134ea6-config\") pod \"logging-loki-distributor-5f678c8dd6-tb6gs\" (UID: \"7a392a7d-824d-420d-bf0d-66ca95134ea6\") " 
pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.335992 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7a392a7d-824d-420d-bf0d-66ca95134ea6-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-tb6gs\" (UID: \"7a392a7d-824d-420d-bf0d-66ca95134ea6\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.338953 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/7a392a7d-824d-420d-bf0d-66ca95134ea6-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-tb6gs\" (UID: \"7a392a7d-824d-420d-bf0d-66ca95134ea6\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.368354 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/7a392a7d-824d-420d-bf0d-66ca95134ea6-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-tb6gs\" (UID: \"7a392a7d-824d-420d-bf0d-66ca95134ea6\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.379452 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z75dm\" (UniqueName: \"kubernetes.io/projected/7a392a7d-824d-420d-bf0d-66ca95134ea6-kube-api-access-z75dm\") pod \"logging-loki-distributor-5f678c8dd6-tb6gs\" (UID: \"7a392a7d-824d-420d-bf0d-66ca95134ea6\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.425273 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/98490098-f31f-4ee3-9f15-ee37b8740035-logging-loki-s3\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.425659 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/98490098-f31f-4ee3-9f15-ee37b8740035-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.425707 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98490098-f31f-4ee3-9f15-ee37b8740035-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.425749 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/98490098-f31f-4ee3-9f15-ee37b8740035-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " 
pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.425820 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdcff\" (UniqueName: \"kubernetes.io/projected/98490098-f31f-4ee3-9f15-ee37b8740035-kube-api-access-rdcff\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.425873 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98490098-f31f-4ee3-9f15-ee37b8740035-config\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.430469 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-xs62z"] Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.434906 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.441746 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-xs62z"] Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.442026 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.443618 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.443813 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.527342 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/98490098-f31f-4ee3-9f15-ee37b8740035-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.527477 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93d83a8b-3334-43f3-b417-58a7fbd7282c-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-xs62z\" (UID: \"93d83a8b-3334-43f3-b417-58a7fbd7282c\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.527513 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98490098-f31f-4ee3-9f15-ee37b8740035-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.527551 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/98490098-f31f-4ee3-9f15-ee37b8740035-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.527575 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdmxq\" (UniqueName: \"kubernetes.io/projected/93d83a8b-3334-43f3-b417-58a7fbd7282c-kube-api-access-kdmxq\") pod \"logging-loki-query-frontend-69d9546745-xs62z\" (UID: \"93d83a8b-3334-43f3-b417-58a7fbd7282c\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.527670 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93d83a8b-3334-43f3-b417-58a7fbd7282c-config\") pod \"logging-loki-query-frontend-69d9546745-xs62z\" (UID: \"93d83a8b-3334-43f3-b417-58a7fbd7282c\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.527732 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdcff\" (UniqueName: \"kubernetes.io/projected/98490098-f31f-4ee3-9f15-ee37b8740035-kube-api-access-rdcff\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.527823 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/93d83a8b-3334-43f3-b417-58a7fbd7282c-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-xs62z\" (UID: \"93d83a8b-3334-43f3-b417-58a7fbd7282c\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.527851 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98490098-f31f-4ee3-9f15-ee37b8740035-config\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.527898 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/93d83a8b-3334-43f3-b417-58a7fbd7282c-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-xs62z\" (UID: \"93d83a8b-3334-43f3-b417-58a7fbd7282c\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.527950 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/98490098-f31f-4ee3-9f15-ee37b8740035-logging-loki-s3\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.531684 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/98490098-f31f-4ee3-9f15-ee37b8740035-config\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.536129 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-5f86bf5685-p62nr"] Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.537464 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/98490098-f31f-4ee3-9f15-ee37b8740035-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.537568 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.538042 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-5f86bf5685-lsthj"] Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.539432 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.540856 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98490098-f31f-4ee3-9f15-ee37b8740035-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.546840 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/98490098-f31f-4ee3-9f15-ee37b8740035-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.547389 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.547637 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.547783 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.547938 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-mlm2x" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.548081 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.548371 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/98490098-f31f-4ee3-9f15-ee37b8740035-logging-loki-s3\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " 
pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.548997 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.561175 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdcff\" (UniqueName: \"kubernetes.io/projected/98490098-f31f-4ee3-9f15-ee37b8740035-kube-api-access-rdcff\") pod \"logging-loki-querier-76788598db-mnp7c\" (UID: \"98490098-f31f-4ee3-9f15-ee37b8740035\") " pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.578359 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5f86bf5685-p62nr"] Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.584799 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5f86bf5685-lsthj"] Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.603492 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.629767 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-tenants\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.629806 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hhh4\" (UniqueName: \"kubernetes.io/projected/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-kube-api-access-7hhh4\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.629830 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e0a2094f-7b9c-426c-b7ea-6a175be407f1-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.629849 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e0a2094f-7b9c-426c-b7ea-6a175be407f1-tls-secret\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.629871 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93d83a8b-3334-43f3-b417-58a7fbd7282c-config\") pod \"logging-loki-query-frontend-69d9546745-xs62z\" (UID: \"93d83a8b-3334-43f3-b417-58a7fbd7282c\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.629903 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0a2094f-7b9c-426c-b7ea-6a175be407f1-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.629925 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.629948 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.629964 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-tls-secret\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.629987 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/93d83a8b-3334-43f3-b417-58a7fbd7282c-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-xs62z\" (UID: \"93d83a8b-3334-43f3-b417-58a7fbd7282c\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.630006 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj56x\" (UniqueName: \"kubernetes.io/projected/e0a2094f-7b9c-426c-b7ea-6a175be407f1-kube-api-access-sj56x\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.630027 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/93d83a8b-3334-43f3-b417-58a7fbd7282c-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-xs62z\" (UID: \"93d83a8b-3334-43f3-b417-58a7fbd7282c\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.630043 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e0a2094f-7b9c-426c-b7ea-6a175be407f1-rbac\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.630090 4721 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.630128 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e0a2094f-7b9c-426c-b7ea-6a175be407f1-lokistack-gateway\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.630147 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93d83a8b-3334-43f3-b417-58a7fbd7282c-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-xs62z\" (UID: \"93d83a8b-3334-43f3-b417-58a7fbd7282c\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.630168 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-rbac\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.630185 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e0a2094f-7b9c-426c-b7ea-6a175be407f1-tenants\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.630207 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdmxq\" (UniqueName: \"kubernetes.io/projected/93d83a8b-3334-43f3-b417-58a7fbd7282c-kube-api-access-kdmxq\") pod \"logging-loki-query-frontend-69d9546745-xs62z\" (UID: \"93d83a8b-3334-43f3-b417-58a7fbd7282c\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.630223 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-lokistack-gateway\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.630241 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0a2094f-7b9c-426c-b7ea-6a175be407f1-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.631633 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/93d83a8b-3334-43f3-b417-58a7fbd7282c-config\") pod \"logging-loki-query-frontend-69d9546745-xs62z\" (UID: \"93d83a8b-3334-43f3-b417-58a7fbd7282c\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.640263 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/93d83a8b-3334-43f3-b417-58a7fbd7282c-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-xs62z\" (UID: \"93d83a8b-3334-43f3-b417-58a7fbd7282c\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.644381 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/93d83a8b-3334-43f3-b417-58a7fbd7282c-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-xs62z\" (UID: \"93d83a8b-3334-43f3-b417-58a7fbd7282c\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.645056 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93d83a8b-3334-43f3-b417-58a7fbd7282c-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-xs62z\" (UID: \"93d83a8b-3334-43f3-b417-58a7fbd7282c\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.662446 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdmxq\" (UniqueName: \"kubernetes.io/projected/93d83a8b-3334-43f3-b417-58a7fbd7282c-kube-api-access-kdmxq\") pod \"logging-loki-query-frontend-69d9546745-xs62z\" (UID: \"93d83a8b-3334-43f3-b417-58a7fbd7282c\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732036 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0a2094f-7b9c-426c-b7ea-6a175be407f1-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732418 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732483 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732511 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: 
\"kubernetes.io/secret/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-tls-secret\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732533 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj56x\" (UniqueName: \"kubernetes.io/projected/e0a2094f-7b9c-426c-b7ea-6a175be407f1-kube-api-access-sj56x\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732556 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e0a2094f-7b9c-426c-b7ea-6a175be407f1-rbac\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732581 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732612 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e0a2094f-7b9c-426c-b7ea-6a175be407f1-lokistack-gateway\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732636 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-rbac\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732655 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e0a2094f-7b9c-426c-b7ea-6a175be407f1-tenants\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732677 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-lokistack-gateway\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732693 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0a2094f-7b9c-426c-b7ea-6a175be407f1-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" 
Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732725 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-tenants\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732741 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hhh4\" (UniqueName: \"kubernetes.io/projected/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-kube-api-access-7hhh4\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732757 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e0a2094f-7b9c-426c-b7ea-6a175be407f1-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.732773 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e0a2094f-7b9c-426c-b7ea-6a175be407f1-tls-secret\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: E0202 13:14:44.732894 4721 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Feb 02 13:14:44 crc kubenswrapper[4721]: E0202 13:14:44.732945 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0a2094f-7b9c-426c-b7ea-6a175be407f1-tls-secret podName:e0a2094f-7b9c-426c-b7ea-6a175be407f1 nodeName:}" failed. No retries permitted until 2026-02-02 13:14:45.232926421 +0000 UTC m=+825.535440810 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/e0a2094f-7b9c-426c-b7ea-6a175be407f1-tls-secret") pod "logging-loki-gateway-5f86bf5685-p62nr" (UID: "e0a2094f-7b9c-426c-b7ea-6a175be407f1") : secret "logging-loki-gateway-http" not found Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.733090 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0a2094f-7b9c-426c-b7ea-6a175be407f1-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: E0202 13:14:44.734096 4721 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Feb 02 13:14:44 crc kubenswrapper[4721]: E0202 13:14:44.734152 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-tls-secret podName:6bbaf0c4-9bfc-4cf9-b238-4f494e492243 nodeName:}" failed. No retries permitted until 2026-02-02 13:14:45.234132023 +0000 UTC m=+825.536646472 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-tls-secret") pod "logging-loki-gateway-5f86bf5685-lsthj" (UID: "6bbaf0c4-9bfc-4cf9-b238-4f494e492243") : secret "logging-loki-gateway-http" not found Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.734248 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-rbac\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.734382 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.734863 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-lokistack-gateway\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.734963 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e0a2094f-7b9c-426c-b7ea-6a175be407f1-lokistack-gateway\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.735049 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e0a2094f-7b9c-426c-b7ea-6a175be407f1-rbac\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.735604 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0a2094f-7b9c-426c-b7ea-6a175be407f1-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.736630 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-logging-loki-ca-bundle\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.736859 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e0a2094f-7b9c-426c-b7ea-6a175be407f1-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " 
pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.740304 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.742512 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e0a2094f-7b9c-426c-b7ea-6a175be407f1-tenants\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.752998 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hhh4\" (UniqueName: \"kubernetes.io/projected/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-kube-api-access-7hhh4\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.753771 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj56x\" (UniqueName: \"kubernetes.io/projected/e0a2094f-7b9c-426c-b7ea-6a175be407f1-kube-api-access-sj56x\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.754703 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-tenants\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.760784 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.764696 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.764731 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:14:44 crc kubenswrapper[4721]: I0202 13:14:44.982471 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs"] Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.072855 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76788598db-mnp7c"] Feb 02 13:14:45 crc kubenswrapper[4721]: W0202 13:14:45.078383 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98490098_f31f_4ee3_9f15_ee37b8740035.slice/crio-fb8e6e345bdefed548652ef504a4b3699207acbcab99af43a7672fc56787ee01 WatchSource:0}: Error finding container fb8e6e345bdefed548652ef504a4b3699207acbcab99af43a7672fc56787ee01: Status 404 returned error can't find the container with id fb8e6e345bdefed548652ef504a4b3699207acbcab99af43a7672fc56787ee01 Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.209821 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-xs62z"] Feb 02 13:14:45 crc kubenswrapper[4721]: W0202 13:14:45.211763 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93d83a8b_3334_43f3_b417_58a7fbd7282c.slice/crio-9223b60892b923767e4ceb8693db69ab690eda2f495ec0aee3e1d718b91f663d WatchSource:0}: Error finding container 9223b60892b923767e4ceb8693db69ab690eda2f495ec0aee3e1d718b91f663d: Status 404 returned error can't find the container with id 9223b60892b923767e4ceb8693db69ab690eda2f495ec0aee3e1d718b91f663d Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.240519 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e0a2094f-7b9c-426c-b7ea-6a175be407f1-tls-secret\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.240602 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-tls-secret\") pod \"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.243952 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6bbaf0c4-9bfc-4cf9-b238-4f494e492243-tls-secret\") pod 
\"logging-loki-gateway-5f86bf5685-lsthj\" (UID: \"6bbaf0c4-9bfc-4cf9-b238-4f494e492243\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.244448 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e0a2094f-7b9c-426c-b7ea-6a175be407f1-tls-secret\") pod \"logging-loki-gateway-5f86bf5685-p62nr\" (UID: \"e0a2094f-7b9c-426c-b7ea-6a175be407f1\") " pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.269924 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.270928 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.273027 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.273411 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.286209 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.342800 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-06ecef5f-e148-418a-a927-77f97664b9ce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06ecef5f-e148-418a-a927-77f97664b9ce\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.360816 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.361870 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.364292 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.364682 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.367928 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.413215 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.414245 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.416184 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.422718 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.422886 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.443865 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3605888-b0c3-4049-8f6a-cd4f380b91a7-config\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.443923 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/b3605888-b0c3-4049-8f6a-cd4f380b91a7-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.443948 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b3605888-b0c3-4049-8f6a-cd4f380b91a7-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.444042 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvzn4\" (UniqueName: \"kubernetes.io/projected/b3605888-b0c3-4049-8f6a-cd4f380b91a7-kube-api-access-xvzn4\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.444107 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2e8486e0-2725-4562-874a-a8a36f02e65a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e8486e0-2725-4562-874a-a8a36f02e65a\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.444338 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/b3605888-b0c3-4049-8f6a-cd4f380b91a7-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.444385 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-06ecef5f-e148-418a-a927-77f97664b9ce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06ecef5f-e148-418a-a927-77f97664b9ce\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " 
pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.444426 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3605888-b0c3-4049-8f6a-cd4f380b91a7-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.447647 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.447685 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-06ecef5f-e148-418a-a927-77f97664b9ce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06ecef5f-e148-418a-a927-77f97664b9ce\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c9eb1d3a011e7af2d0e565897dbc25965e43c1a4bea15041b5d2658356336c80/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.479608 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-06ecef5f-e148-418a-a927-77f97664b9ce\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06ecef5f-e148-418a-a927-77f97664b9ce\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.480985 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.492546 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.546572 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/b3605888-b0c3-4049-8f6a-cd4f380b91a7-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.546633 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cb9902a-5fe1-42ee-a659-eebccc3aec15-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.546658 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-config\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.546684 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/2cb9902a-5fe1-42ee-a659-eebccc3aec15-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.546725 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.546755 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3605888-b0c3-4049-8f6a-cd4f380b91a7-config\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.546783 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/b3605888-b0c3-4049-8f6a-cd4f380b91a7-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.546804 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b3605888-b0c3-4049-8f6a-cd4f380b91a7-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.546831 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: 
\"kubernetes.io/secret/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.546863 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.546888 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb9902a-5fe1-42ee-a659-eebccc3aec15-config\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.546928 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68lzt\" (UniqueName: \"kubernetes.io/projected/2cb9902a-5fe1-42ee-a659-eebccc3aec15-kube-api-access-68lzt\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.546953 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.546974 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/2cb9902a-5fe1-42ee-a659-eebccc3aec15-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.547003 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a406acb1-d9b3-46b5-9297-bf84583d192a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a406acb1-d9b3-46b5-9297-bf84583d192a\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.547027 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/2cb9902a-5fe1-42ee-a659-eebccc3aec15-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.547061 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3605888-b0c3-4049-8f6a-cd4f380b91a7-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: 
\"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.547117 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ed927557-1ad3-416a-885d-33009b830d19\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed927557-1ad3-416a-885d-33009b830d19\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.547164 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvzn4\" (UniqueName: \"kubernetes.io/projected/b3605888-b0c3-4049-8f6a-cd4f380b91a7-kube-api-access-xvzn4\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.547188 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xgtx\" (UniqueName: \"kubernetes.io/projected/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-kube-api-access-9xgtx\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.547215 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2e8486e0-2725-4562-874a-a8a36f02e65a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e8486e0-2725-4562-874a-a8a36f02e65a\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.550243 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.550287 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2e8486e0-2725-4562-874a-a8a36f02e65a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e8486e0-2725-4562-874a-a8a36f02e65a\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f21ff46e0cccf1394ebfcd6d08b5381237263e0acb5f3499005b577002275034/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.550520 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3605888-b0c3-4049-8f6a-cd4f380b91a7-config\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.550929 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/b3605888-b0c3-4049-8f6a-cd4f380b91a7-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.551009 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3605888-b0c3-4049-8f6a-cd4f380b91a7-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.552757 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/b3605888-b0c3-4049-8f6a-cd4f380b91a7-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.555753 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/b3605888-b0c3-4049-8f6a-cd4f380b91a7-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.567355 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvzn4\" (UniqueName: \"kubernetes.io/projected/b3605888-b0c3-4049-8f6a-cd4f380b91a7-kube-api-access-xvzn4\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.585800 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2e8486e0-2725-4562-874a-a8a36f02e65a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2e8486e0-2725-4562-874a-a8a36f02e65a\") pod \"logging-loki-ingester-0\" (UID: \"b3605888-b0c3-4049-8f6a-cd4f380b91a7\") " pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.649218 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ed927557-1ad3-416a-885d-33009b830d19\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed927557-1ad3-416a-885d-33009b830d19\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.649331 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xgtx\" (UniqueName: \"kubernetes.io/projected/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-kube-api-access-9xgtx\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.649729 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cb9902a-5fe1-42ee-a659-eebccc3aec15-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.649760 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-config\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.649783 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/2cb9902a-5fe1-42ee-a659-eebccc3aec15-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.649851 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.649892 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.649924 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.649948 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb9902a-5fe1-42ee-a659-eebccc3aec15-config\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.649988 4721 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-68lzt\" (UniqueName: \"kubernetes.io/projected/2cb9902a-5fe1-42ee-a659-eebccc3aec15-kube-api-access-68lzt\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.651595 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cb9902a-5fe1-42ee-a659-eebccc3aec15-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.651794 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb9902a-5fe1-42ee-a659-eebccc3aec15-config\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.652099 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.652505 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.652538 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/2cb9902a-5fe1-42ee-a659-eebccc3aec15-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.652568 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a406acb1-d9b3-46b5-9297-bf84583d192a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a406acb1-d9b3-46b5-9297-bf84583d192a\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.652600 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/2cb9902a-5fe1-42ee-a659-eebccc3aec15-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.652755 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-config\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: 
I0202 13:14:45.656455 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/2cb9902a-5fe1-42ee-a659-eebccc3aec15-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.658559 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.658592 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ed927557-1ad3-416a-885d-33009b830d19\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed927557-1ad3-416a-885d-33009b830d19\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/575151a99aa8257e6030b1ba8770c9171521934c8cf83d5c7cd7603da5ef5b63/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.664459 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.665631 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/2cb9902a-5fe1-42ee-a659-eebccc3aec15-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.665758 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.666566 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.667367 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/2cb9902a-5fe1-42ee-a659-eebccc3aec15-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.668887 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xgtx\" (UniqueName: \"kubernetes.io/projected/05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e-kube-api-access-9xgtx\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " 
pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.670643 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68lzt\" (UniqueName: \"kubernetes.io/projected/2cb9902a-5fe1-42ee-a659-eebccc3aec15-kube-api-access-68lzt\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.670806 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.670832 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a406acb1-d9b3-46b5-9297-bf84583d192a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a406acb1-d9b3-46b5-9297-bf84583d192a\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9bd2a3d38f0af1e1aa0b202c6a67c0c676ffcc4415aa72f91ba06ab5d848d79c/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.713716 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ed927557-1ad3-416a-885d-33009b830d19\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ed927557-1ad3-416a-885d-33009b830d19\") pod \"logging-loki-compactor-0\" (UID: \"2cb9902a-5fe1-42ee-a659-eebccc3aec15\") " pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.782565 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a406acb1-d9b3-46b5-9297-bf84583d192a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a406acb1-d9b3-46b5-9297-bf84583d192a\") pod \"logging-loki-index-gateway-0\" (UID: \"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.816230 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5f86bf5685-lsthj"] Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.837262 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" event={"ID":"98490098-f31f-4ee3-9f15-ee37b8740035","Type":"ContainerStarted","Data":"fb8e6e345bdefed548652ef504a4b3699207acbcab99af43a7672fc56787ee01"} Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.838808 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" event={"ID":"6bbaf0c4-9bfc-4cf9-b238-4f494e492243","Type":"ContainerStarted","Data":"4711b7eb4a0da7212314f1f32d2fe079233728253146fbb3397169eec06f789c"} Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.844058 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" event={"ID":"93d83a8b-3334-43f3-b417-58a7fbd7282c","Type":"ContainerStarted","Data":"9223b60892b923767e4ceb8693db69ab690eda2f495ec0aee3e1d718b91f663d"} Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.868129 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" 
event={"ID":"7a392a7d-824d-420d-bf0d-66ca95134ea6","Type":"ContainerStarted","Data":"ca0113198309a389da25790ee60a87a610d64b679357d9ae5ad4f1e3342fca45"} Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.868251 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2j8r5" podUID="2c09ae42-5e47-43fd-bcdb-b843f0a80cf5" containerName="registry-server" containerID="cri-o://489cc6a3cbe57f3f7a1a1c8c11ef02d4ab91eb4ecc18fe6c0a2d47b0318d8936" gracePeriod=2 Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.884230 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-5f86bf5685-p62nr"] Feb 02 13:14:45 crc kubenswrapper[4721]: I0202 13:14:45.884481 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:45 crc kubenswrapper[4721]: W0202 13:14:45.895565 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0a2094f_7b9c_426c_b7ea_6a175be407f1.slice/crio-91aeae4ccd700b2348caa4d837157cb32f19a714b3ae997f4aac1d12ed759a3a WatchSource:0}: Error finding container 91aeae4ccd700b2348caa4d837157cb32f19a714b3ae997f4aac1d12ed759a3a: Status 404 returned error can't find the container with id 91aeae4ccd700b2348caa4d837157cb32f19a714b3ae997f4aac1d12ed759a3a Feb 02 13:14:46 crc kubenswrapper[4721]: I0202 13:14:46.014677 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:46 crc kubenswrapper[4721]: I0202 13:14:46.038234 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:46 crc kubenswrapper[4721]: I0202 13:14:46.298457 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Feb 02 13:14:46 crc kubenswrapper[4721]: W0202 13:14:46.311102 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3605888_b0c3_4049_8f6a_cd4f380b91a7.slice/crio-6084a3489789763a95ef4ea2a853ec9afb5812a4c9323b9c401d41a3bc378eb5 WatchSource:0}: Error finding container 6084a3489789763a95ef4ea2a853ec9afb5812a4c9323b9c401d41a3bc378eb5: Status 404 returned error can't find the container with id 6084a3489789763a95ef4ea2a853ec9afb5812a4c9323b9c401d41a3bc378eb5 Feb 02 13:14:46 crc kubenswrapper[4721]: I0202 13:14:46.508587 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Feb 02 13:14:46 crc kubenswrapper[4721]: W0202 13:14:46.513693 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05f7eb9f_7ce9_4d66_b8e1_cc9eb0c1949e.slice/crio-5bf6122aacafc0b6388120a083d8b2971b4a41f1758cdb149ab8f88b8da297bd WatchSource:0}: Error finding container 5bf6122aacafc0b6388120a083d8b2971b4a41f1758cdb149ab8f88b8da297bd: Status 404 returned error can't find the container with id 5bf6122aacafc0b6388120a083d8b2971b4a41f1758cdb149ab8f88b8da297bd Feb 02 13:14:46 crc kubenswrapper[4721]: I0202 13:14:46.514236 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Feb 02 13:14:46 crc kubenswrapper[4721]: I0202 13:14:46.875564 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-logging/logging-loki-ingester-0" event={"ID":"b3605888-b0c3-4049-8f6a-cd4f380b91a7","Type":"ContainerStarted","Data":"6084a3489789763a95ef4ea2a853ec9afb5812a4c9323b9c401d41a3bc378eb5"} Feb 02 13:14:46 crc kubenswrapper[4721]: I0202 13:14:46.876899 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" event={"ID":"e0a2094f-7b9c-426c-b7ea-6a175be407f1","Type":"ContainerStarted","Data":"91aeae4ccd700b2348caa4d837157cb32f19a714b3ae997f4aac1d12ed759a3a"} Feb 02 13:14:46 crc kubenswrapper[4721]: I0202 13:14:46.878343 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e","Type":"ContainerStarted","Data":"5bf6122aacafc0b6388120a083d8b2971b4a41f1758cdb149ab8f88b8da297bd"} Feb 02 13:14:46 crc kubenswrapper[4721]: I0202 13:14:46.882001 4721 generic.go:334] "Generic (PLEG): container finished" podID="2c09ae42-5e47-43fd-bcdb-b843f0a80cf5" containerID="489cc6a3cbe57f3f7a1a1c8c11ef02d4ab91eb4ecc18fe6c0a2d47b0318d8936" exitCode=0 Feb 02 13:14:46 crc kubenswrapper[4721]: I0202 13:14:46.882052 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2j8r5" event={"ID":"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5","Type":"ContainerDied","Data":"489cc6a3cbe57f3f7a1a1c8c11ef02d4ab91eb4ecc18fe6c0a2d47b0318d8936"} Feb 02 13:14:46 crc kubenswrapper[4721]: I0202 13:14:46.882970 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"2cb9902a-5fe1-42ee-a659-eebccc3aec15","Type":"ContainerStarted","Data":"b05028a727fbcccbebb99f52d6adbbba29cde94cac69a952fa8f40449a35b940"} Feb 02 13:14:46 crc kubenswrapper[4721]: I0202 13:14:46.992106 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:47 crc kubenswrapper[4721]: I0202 13:14:47.187996 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-catalog-content\") pod \"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5\" (UID: \"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5\") " Feb 02 13:14:47 crc kubenswrapper[4721]: I0202 13:14:47.188111 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-utilities\") pod \"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5\" (UID: \"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5\") " Feb 02 13:14:47 crc kubenswrapper[4721]: I0202 13:14:47.188175 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p27h\" (UniqueName: \"kubernetes.io/projected/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-kube-api-access-4p27h\") pod \"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5\" (UID: \"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5\") " Feb 02 13:14:47 crc kubenswrapper[4721]: I0202 13:14:47.189432 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-utilities" (OuterVolumeSpecName: "utilities") pod "2c09ae42-5e47-43fd-bcdb-b843f0a80cf5" (UID: "2c09ae42-5e47-43fd-bcdb-b843f0a80cf5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:14:47 crc kubenswrapper[4721]: I0202 13:14:47.205046 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-kube-api-access-4p27h" (OuterVolumeSpecName: "kube-api-access-4p27h") pod "2c09ae42-5e47-43fd-bcdb-b843f0a80cf5" (UID: "2c09ae42-5e47-43fd-bcdb-b843f0a80cf5"). InnerVolumeSpecName "kube-api-access-4p27h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:14:47 crc kubenswrapper[4721]: I0202 13:14:47.217966 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c09ae42-5e47-43fd-bcdb-b843f0a80cf5" (UID: "2c09ae42-5e47-43fd-bcdb-b843f0a80cf5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:14:47 crc kubenswrapper[4721]: I0202 13:14:47.290092 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:14:47 crc kubenswrapper[4721]: I0202 13:14:47.290126 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:14:47 crc kubenswrapper[4721]: I0202 13:14:47.290136 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p27h\" (UniqueName: \"kubernetes.io/projected/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5-kube-api-access-4p27h\") on node \"crc\" DevicePath \"\"" Feb 02 13:14:47 crc kubenswrapper[4721]: I0202 13:14:47.894156 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2j8r5" event={"ID":"2c09ae42-5e47-43fd-bcdb-b843f0a80cf5","Type":"ContainerDied","Data":"c4a7acf51223f45f628cd91f41481e7aede65390c1e3de98544e03d9db406fc1"} Feb 02 13:14:47 crc kubenswrapper[4721]: I0202 13:14:47.894215 4721 scope.go:117] "RemoveContainer" containerID="489cc6a3cbe57f3f7a1a1c8c11ef02d4ab91eb4ecc18fe6c0a2d47b0318d8936" Feb 02 13:14:47 crc kubenswrapper[4721]: I0202 13:14:47.894352 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2j8r5" Feb 02 13:14:47 crc kubenswrapper[4721]: I0202 13:14:47.927866 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2j8r5"] Feb 02 13:14:47 crc kubenswrapper[4721]: I0202 13:14:47.937350 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2j8r5"] Feb 02 13:14:48 crc kubenswrapper[4721]: I0202 13:14:48.418995 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c09ae42-5e47-43fd-bcdb-b843f0a80cf5" path="/var/lib/kubelet/pods/2c09ae42-5e47-43fd-bcdb-b843f0a80cf5/volumes" Feb 02 13:14:48 crc kubenswrapper[4721]: I0202 13:14:48.610457 4721 scope.go:117] "RemoveContainer" containerID="e28c5c9633828095d9e468bec9d5dca15024d1f9adc02fe17ec54b56e17309d6" Feb 02 13:14:49 crc kubenswrapper[4721]: I0202 13:14:49.517448 4721 scope.go:117] "RemoveContainer" containerID="257023ea900b686319d684f93bf86174efc5e0727f68b5296e819f6673afb9f7" Feb 02 13:14:49 crc kubenswrapper[4721]: I0202 13:14:49.915698 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" event={"ID":"98490098-f31f-4ee3-9f15-ee37b8740035","Type":"ContainerStarted","Data":"4086d3da4ad2c2d21614ee02f6784e740d57e601d320d13ca9a8bea09c5f5a57"} Feb 02 13:14:49 crc kubenswrapper[4721]: I0202 13:14:49.916314 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:14:49 crc kubenswrapper[4721]: I0202 13:14:49.920617 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" event={"ID":"6bbaf0c4-9bfc-4cf9-b238-4f494e492243","Type":"ContainerStarted","Data":"cea591203469d67d8ac27cf0aea109b32f50ca2717610bec7ec2894317741a5a"} Feb 02 13:14:49 crc kubenswrapper[4721]: I0202 13:14:49.935726 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" podStartSLOduration=1.403821258 podStartE2EDuration="5.935710739s" podCreationTimestamp="2026-02-02 13:14:44 +0000 UTC" firstStartedPulling="2026-02-02 13:14:45.086236746 +0000 UTC m=+825.388751135" lastFinishedPulling="2026-02-02 13:14:49.618126227 +0000 UTC m=+829.920640616" observedRunningTime="2026-02-02 13:14:49.932663418 +0000 UTC m=+830.235177827" watchObservedRunningTime="2026-02-02 13:14:49.935710739 +0000 UTC m=+830.238225128" Feb 02 13:14:49 crc kubenswrapper[4721]: I0202 13:14:49.948162 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"b3605888-b0c3-4049-8f6a-cd4f380b91a7","Type":"ContainerStarted","Data":"3ea5bbcf7382aaf12d7111cd0592cf9fa3971e14a3ad042db238a1b506c5d58b"} Feb 02 13:14:49 crc kubenswrapper[4721]: I0202 13:14:49.948248 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:14:49 crc kubenswrapper[4721]: I0202 13:14:49.957954 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" event={"ID":"93d83a8b-3334-43f3-b417-58a7fbd7282c","Type":"ContainerStarted","Data":"900c298808bd159f79b6e9efb1aab9db8bb0cbfd9bf23be67e6fefdae5083d1e"} Feb 02 13:14:49 crc kubenswrapper[4721]: I0202 13:14:49.958690 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:14:49 crc kubenswrapper[4721]: I0202 13:14:49.974056 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=2.67938469 podStartE2EDuration="5.97403792s" podCreationTimestamp="2026-02-02 13:14:44 +0000 UTC" firstStartedPulling="2026-02-02 13:14:46.313597776 +0000 UTC m=+826.616112165" lastFinishedPulling="2026-02-02 13:14:49.608251006 +0000 UTC m=+829.910765395" observedRunningTime="2026-02-02 13:14:49.970030735 +0000 UTC m=+830.272545124" watchObservedRunningTime="2026-02-02 13:14:49.97403792 +0000 UTC m=+830.276552309" Feb 02 13:14:50 crc kubenswrapper[4721]: I0202 13:14:50.000580 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" podStartSLOduration=1.64307143 podStartE2EDuration="6.00056389s" podCreationTimestamp="2026-02-02 13:14:44 +0000 UTC" firstStartedPulling="2026-02-02 13:14:45.213890124 +0000 UTC m=+825.516404513" lastFinishedPulling="2026-02-02 13:14:49.571382584 +0000 UTC m=+829.873896973" observedRunningTime="2026-02-02 13:14:49.999179543 +0000 UTC m=+830.301693932" watchObservedRunningTime="2026-02-02 13:14:50.00056389 +0000 UTC m=+830.303078279" Feb 02 13:14:50 crc kubenswrapper[4721]: I0202 13:14:50.982628 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" event={"ID":"e0a2094f-7b9c-426c-b7ea-6a175be407f1","Type":"ContainerStarted","Data":"52c56b9b4aad7d9613498e92d74085c1dd666eecd51d58b1ef0594a883c327fd"} Feb 02 13:14:50 crc kubenswrapper[4721]: I0202 13:14:50.984563 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e","Type":"ContainerStarted","Data":"06d760a66be361fb8a9a24ac09b04fea7ab602f9b48628062068683420d1b72d"} Feb 02 13:14:50 crc kubenswrapper[4721]: I0202 13:14:50.984647 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:14:50 crc kubenswrapper[4721]: I0202 13:14:50.985785 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" event={"ID":"7a392a7d-824d-420d-bf0d-66ca95134ea6","Type":"ContainerStarted","Data":"6b1866add1234c812d9728a8f46d75b7307fee4f54c3173bc8d783d1d450e86e"} Feb 02 13:14:50 crc kubenswrapper[4721]: I0202 13:14:50.985886 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:14:50 crc kubenswrapper[4721]: I0202 13:14:50.987786 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"2cb9902a-5fe1-42ee-a659-eebccc3aec15","Type":"ContainerStarted","Data":"ebdb48cc8ece068cb66b383eb389088a29273b254af0e3c7749c23b159db4cce"} Feb 02 13:14:51 crc kubenswrapper[4721]: I0202 13:14:51.023049 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.923040222 podStartE2EDuration="7.023027805s" podCreationTimestamp="2026-02-02 13:14:44 +0000 UTC" firstStartedPulling="2026-02-02 13:14:46.518944826 +0000 UTC m=+826.821459215" lastFinishedPulling="2026-02-02 13:14:49.618932409 +0000 UTC m=+829.921446798" observedRunningTime="2026-02-02 13:14:51.003263273 +0000 UTC 
m=+831.305777662" watchObservedRunningTime="2026-02-02 13:14:51.023027805 +0000 UTC m=+831.325542204" Feb 02 13:14:51 crc kubenswrapper[4721]: I0202 13:14:51.023196 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.885224034 podStartE2EDuration="7.023190179s" podCreationTimestamp="2026-02-02 13:14:44 +0000 UTC" firstStartedPulling="2026-02-02 13:14:46.51228849 +0000 UTC m=+826.814802879" lastFinishedPulling="2026-02-02 13:14:49.650254625 +0000 UTC m=+829.952769024" observedRunningTime="2026-02-02 13:14:51.02060617 +0000 UTC m=+831.323120569" watchObservedRunningTime="2026-02-02 13:14:51.023190179 +0000 UTC m=+831.325704588" Feb 02 13:14:51 crc kubenswrapper[4721]: I0202 13:14:51.039362 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" podStartSLOduration=2.371687269 podStartE2EDuration="7.039344405s" podCreationTimestamp="2026-02-02 13:14:44 +0000 UTC" firstStartedPulling="2026-02-02 13:14:44.98601068 +0000 UTC m=+825.288525069" lastFinishedPulling="2026-02-02 13:14:49.653667816 +0000 UTC m=+829.956182205" observedRunningTime="2026-02-02 13:14:51.037979679 +0000 UTC m=+831.340494088" watchObservedRunningTime="2026-02-02 13:14:51.039344405 +0000 UTC m=+831.341858794" Feb 02 13:14:51 crc kubenswrapper[4721]: I0202 13:14:51.993741 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:14:55 crc kubenswrapper[4721]: I0202 13:14:55.019013 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" event={"ID":"6bbaf0c4-9bfc-4cf9-b238-4f494e492243","Type":"ContainerStarted","Data":"69ac97b235683e1115e495b613194b431973148a4f0ab3c4a1f9129c8c0097aa"} Feb 02 13:14:55 crc kubenswrapper[4721]: I0202 13:14:55.019598 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:55 crc kubenswrapper[4721]: I0202 13:14:55.019774 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:55 crc kubenswrapper[4721]: I0202 13:14:55.022394 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" event={"ID":"e0a2094f-7b9c-426c-b7ea-6a175be407f1","Type":"ContainerStarted","Data":"f3be0acc575a5a61840ab054e14736571b1a91aa8208b670d6c569c7f42d8749"} Feb 02 13:14:55 crc kubenswrapper[4721]: I0202 13:14:55.023023 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:55 crc kubenswrapper[4721]: I0202 13:14:55.023278 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:55 crc kubenswrapper[4721]: I0202 13:14:55.030710 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:55 crc kubenswrapper[4721]: I0202 13:14:55.033753 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" Feb 02 13:14:55 crc kubenswrapper[4721]: I0202 13:14:55.034202 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:55 crc kubenswrapper[4721]: I0202 13:14:55.035697 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" Feb 02 13:14:55 crc kubenswrapper[4721]: I0202 13:14:55.071242 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-5f86bf5685-lsthj" podStartSLOduration=2.681433814 podStartE2EDuration="11.071210401s" podCreationTimestamp="2026-02-02 13:14:44 +0000 UTC" firstStartedPulling="2026-02-02 13:14:45.835618842 +0000 UTC m=+826.138133231" lastFinishedPulling="2026-02-02 13:14:54.225395429 +0000 UTC m=+834.527909818" observedRunningTime="2026-02-02 13:14:55.052139497 +0000 UTC m=+835.354653906" watchObservedRunningTime="2026-02-02 13:14:55.071210401 +0000 UTC m=+835.373724840" Feb 02 13:14:55 crc kubenswrapper[4721]: I0202 13:14:55.125090 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-5f86bf5685-p62nr" podStartSLOduration=2.794109037 podStartE2EDuration="11.125045301s" podCreationTimestamp="2026-02-02 13:14:44 +0000 UTC" firstStartedPulling="2026-02-02 13:14:45.898283426 +0000 UTC m=+826.200797815" lastFinishedPulling="2026-02-02 13:14:54.22921969 +0000 UTC m=+834.531734079" observedRunningTime="2026-02-02 13:14:55.117550994 +0000 UTC m=+835.420065403" watchObservedRunningTime="2026-02-02 13:14:55.125045301 +0000 UTC m=+835.427559720" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.204960 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn"] Feb 02 13:15:00 crc kubenswrapper[4721]: E0202 13:15:00.205660 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c09ae42-5e47-43fd-bcdb-b843f0a80cf5" containerName="extract-content" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.205679 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c09ae42-5e47-43fd-bcdb-b843f0a80cf5" containerName="extract-content" Feb 02 13:15:00 crc kubenswrapper[4721]: E0202 13:15:00.205709 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c09ae42-5e47-43fd-bcdb-b843f0a80cf5" containerName="extract-utilities" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.205717 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c09ae42-5e47-43fd-bcdb-b843f0a80cf5" containerName="extract-utilities" Feb 02 13:15:00 crc kubenswrapper[4721]: E0202 13:15:00.205738 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c09ae42-5e47-43fd-bcdb-b843f0a80cf5" containerName="registry-server" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.205745 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c09ae42-5e47-43fd-bcdb-b843f0a80cf5" containerName="registry-server" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.205986 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c09ae42-5e47-43fd-bcdb-b843f0a80cf5" containerName="registry-server" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.206583 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.210562 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.212582 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.222079 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn"] Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.289520 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13c984cb-b059-4e3f-86f2-8abca8e6942e-secret-volume\") pod \"collect-profiles-29500635-qjbdn\" (UID: \"13c984cb-b059-4e3f-86f2-8abca8e6942e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.289633 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13c984cb-b059-4e3f-86f2-8abca8e6942e-config-volume\") pod \"collect-profiles-29500635-qjbdn\" (UID: \"13c984cb-b059-4e3f-86f2-8abca8e6942e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.289684 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x2fq\" (UniqueName: \"kubernetes.io/projected/13c984cb-b059-4e3f-86f2-8abca8e6942e-kube-api-access-4x2fq\") pod \"collect-profiles-29500635-qjbdn\" (UID: \"13c984cb-b059-4e3f-86f2-8abca8e6942e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.390491 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13c984cb-b059-4e3f-86f2-8abca8e6942e-config-volume\") pod \"collect-profiles-29500635-qjbdn\" (UID: \"13c984cb-b059-4e3f-86f2-8abca8e6942e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.390548 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x2fq\" (UniqueName: \"kubernetes.io/projected/13c984cb-b059-4e3f-86f2-8abca8e6942e-kube-api-access-4x2fq\") pod \"collect-profiles-29500635-qjbdn\" (UID: \"13c984cb-b059-4e3f-86f2-8abca8e6942e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.390637 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13c984cb-b059-4e3f-86f2-8abca8e6942e-secret-volume\") pod \"collect-profiles-29500635-qjbdn\" (UID: \"13c984cb-b059-4e3f-86f2-8abca8e6942e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.392536 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 
13:15:00.397393 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13c984cb-b059-4e3f-86f2-8abca8e6942e-secret-volume\") pod \"collect-profiles-29500635-qjbdn\" (UID: \"13c984cb-b059-4e3f-86f2-8abca8e6942e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.401740 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13c984cb-b059-4e3f-86f2-8abca8e6942e-config-volume\") pod \"collect-profiles-29500635-qjbdn\" (UID: \"13c984cb-b059-4e3f-86f2-8abca8e6942e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.413045 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x2fq\" (UniqueName: \"kubernetes.io/projected/13c984cb-b059-4e3f-86f2-8abca8e6942e-kube-api-access-4x2fq\") pod \"collect-profiles-29500635-qjbdn\" (UID: \"13c984cb-b059-4e3f-86f2-8abca8e6942e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.525810 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.535123 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" Feb 02 13:15:00 crc kubenswrapper[4721]: I0202 13:15:00.751862 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn"] Feb 02 13:15:01 crc kubenswrapper[4721]: I0202 13:15:01.065496 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" event={"ID":"13c984cb-b059-4e3f-86f2-8abca8e6942e","Type":"ContainerStarted","Data":"dbd339a45a88197a4052721f67969cee0f84e4e520076f5a19a1a3e14ab9298f"} Feb 02 13:15:01 crc kubenswrapper[4721]: I0202 13:15:01.065540 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" event={"ID":"13c984cb-b059-4e3f-86f2-8abca8e6942e","Type":"ContainerStarted","Data":"1e4acb7e64b443ef71b0b41d513ba87644d053deb453fd5009c15092f6438056"} Feb 02 13:15:01 crc kubenswrapper[4721]: I0202 13:15:01.080547 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" podStartSLOduration=1.080531614 podStartE2EDuration="1.080531614s" podCreationTimestamp="2026-02-02 13:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:15:01.077224796 +0000 UTC m=+841.379739185" watchObservedRunningTime="2026-02-02 13:15:01.080531614 +0000 UTC m=+841.383046003" Feb 02 13:15:02 crc kubenswrapper[4721]: I0202 13:15:02.074176 4721 generic.go:334] "Generic (PLEG): container finished" podID="13c984cb-b059-4e3f-86f2-8abca8e6942e" containerID="dbd339a45a88197a4052721f67969cee0f84e4e520076f5a19a1a3e14ab9298f" exitCode=0 Feb 02 13:15:02 crc kubenswrapper[4721]: I0202 13:15:02.074294 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" 
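Each VerifyControllerAttachedVolume / MountVolume.SetUp entry above names its volume twice: once by short name and once by a UniqueName of the form kubernetes.io/<plugin>/<podUID>-<name>. Since the pod UID is fixed-width (36 characters), splitting a UniqueName apart is mechanical; a sketch using one from the collect-profiles entries above (this shape holds for the empty-dir, configmap, secret, and projected volumes seen here, though not necessarily for every plugin):

    # UniqueName format as it appears in the mount entries above.
    unique = "kubernetes.io/projected/13c984cb-b059-4e3f-86f2-8abca8e6942e-kube-api-access-4x2fq"
    _, plugin, rest = unique.split("/", 2)
    pod_uid, vol_name = rest[:36], rest[37:]  # skip the "-" joining UID and name
    assert pod_uid == "13c984cb-b059-4e3f-86f2-8abca8e6942e"
    print(plugin, pod_uid, vol_name)  # projected ... kube-api-access-4x2fq
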
event={"ID":"13c984cb-b059-4e3f-86f2-8abca8e6942e","Type":"ContainerDied","Data":"dbd339a45a88197a4052721f67969cee0f84e4e520076f5a19a1a3e14ab9298f"} Feb 02 13:15:03 crc kubenswrapper[4721]: I0202 13:15:03.333669 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" Feb 02 13:15:03 crc kubenswrapper[4721]: I0202 13:15:03.434818 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13c984cb-b059-4e3f-86f2-8abca8e6942e-config-volume\") pod \"13c984cb-b059-4e3f-86f2-8abca8e6942e\" (UID: \"13c984cb-b059-4e3f-86f2-8abca8e6942e\") " Feb 02 13:15:03 crc kubenswrapper[4721]: I0202 13:15:03.434968 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13c984cb-b059-4e3f-86f2-8abca8e6942e-secret-volume\") pod \"13c984cb-b059-4e3f-86f2-8abca8e6942e\" (UID: \"13c984cb-b059-4e3f-86f2-8abca8e6942e\") " Feb 02 13:15:03 crc kubenswrapper[4721]: I0202 13:15:03.435038 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x2fq\" (UniqueName: \"kubernetes.io/projected/13c984cb-b059-4e3f-86f2-8abca8e6942e-kube-api-access-4x2fq\") pod \"13c984cb-b059-4e3f-86f2-8abca8e6942e\" (UID: \"13c984cb-b059-4e3f-86f2-8abca8e6942e\") " Feb 02 13:15:03 crc kubenswrapper[4721]: I0202 13:15:03.436008 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13c984cb-b059-4e3f-86f2-8abca8e6942e-config-volume" (OuterVolumeSpecName: "config-volume") pod "13c984cb-b059-4e3f-86f2-8abca8e6942e" (UID: "13c984cb-b059-4e3f-86f2-8abca8e6942e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:15:03 crc kubenswrapper[4721]: I0202 13:15:03.440054 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c984cb-b059-4e3f-86f2-8abca8e6942e-kube-api-access-4x2fq" (OuterVolumeSpecName: "kube-api-access-4x2fq") pod "13c984cb-b059-4e3f-86f2-8abca8e6942e" (UID: "13c984cb-b059-4e3f-86f2-8abca8e6942e"). InnerVolumeSpecName "kube-api-access-4x2fq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:15:03 crc kubenswrapper[4721]: I0202 13:15:03.440734 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c984cb-b059-4e3f-86f2-8abca8e6942e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "13c984cb-b059-4e3f-86f2-8abca8e6942e" (UID: "13c984cb-b059-4e3f-86f2-8abca8e6942e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:15:03 crc kubenswrapper[4721]: I0202 13:15:03.536703 4721 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/13c984cb-b059-4e3f-86f2-8abca8e6942e-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 13:15:03 crc kubenswrapper[4721]: I0202 13:15:03.536751 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x2fq\" (UniqueName: \"kubernetes.io/projected/13c984cb-b059-4e3f-86f2-8abca8e6942e-kube-api-access-4x2fq\") on node \"crc\" DevicePath \"\"" Feb 02 13:15:03 crc kubenswrapper[4721]: I0202 13:15:03.536765 4721 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13c984cb-b059-4e3f-86f2-8abca8e6942e-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 13:15:04 crc kubenswrapper[4721]: I0202 13:15:04.089679 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" event={"ID":"13c984cb-b059-4e3f-86f2-8abca8e6942e","Type":"ContainerDied","Data":"1e4acb7e64b443ef71b0b41d513ba87644d053deb453fd5009c15092f6438056"} Feb 02 13:15:04 crc kubenswrapper[4721]: I0202 13:15:04.089735 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e4acb7e64b443ef71b0b41d513ba87644d053deb453fd5009c15092f6438056" Feb 02 13:15:04 crc kubenswrapper[4721]: I0202 13:15:04.089743 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn" Feb 02 13:15:04 crc kubenswrapper[4721]: I0202 13:15:04.452102 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-tb6gs" Feb 02 13:15:04 crc kubenswrapper[4721]: I0202 13:15:04.609349 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-76788598db-mnp7c" Feb 02 13:15:04 crc kubenswrapper[4721]: I0202 13:15:04.798153 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-69d9546745-xs62z" Feb 02 13:15:05 crc kubenswrapper[4721]: I0202 13:15:05.890316 4721 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Feb 02 13:15:05 crc kubenswrapper[4721]: I0202 13:15:05.890658 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="b3605888-b0c3-4049-8f6a-cd4f380b91a7" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 02 13:15:06 crc kubenswrapper[4721]: I0202 13:15:06.022578 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Feb 02 13:15:06 crc kubenswrapper[4721]: I0202 13:15:06.050921 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 13:15:14 crc kubenswrapper[4721]: I0202 13:15:14.763637 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:15:14 crc kubenswrapper[4721]: I0202 13:15:14.764275 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:15:14 crc kubenswrapper[4721]: I0202 13:15:14.764345 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:15:14 crc kubenswrapper[4721]: I0202 13:15:14.765358 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e271bf7e19d8205d47335a427c173d1e8d60e0f2a6167b224679306973cc1cc"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 13:15:14 crc kubenswrapper[4721]: I0202 13:15:14.765505 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://4e271bf7e19d8205d47335a427c173d1e8d60e0f2a6167b224679306973cc1cc" gracePeriod=600 Feb 02 13:15:15 crc kubenswrapper[4721]: I0202 13:15:15.188642 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="4e271bf7e19d8205d47335a427c173d1e8d60e0f2a6167b224679306973cc1cc" exitCode=0 Feb 02 13:15:15 crc kubenswrapper[4721]: I0202 13:15:15.188715 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"4e271bf7e19d8205d47335a427c173d1e8d60e0f2a6167b224679306973cc1cc"} Feb 02 13:15:15 crc kubenswrapper[4721]: I0202 13:15:15.188987 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"3e250ca33160c82bc83b5c1d01cc482ebd55cdb3c1b9ae291d6af786cb617e66"} Feb 02 13:15:15 crc kubenswrapper[4721]: I0202 13:15:15.189017 4721 scope.go:117] "RemoveContainer" containerID="014ac4f70cadb2e5ed3977a4b883172b9e9190b5cdf25295500702abdd38ede7" Feb 02 13:15:15 crc kubenswrapper[4721]: I0202 13:15:15.892274 4721 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Feb 02 13:15:15 crc kubenswrapper[4721]: I0202 13:15:15.892635 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="b3605888-b0c3-4049-8f6a-cd4f380b91a7" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 02 13:15:25 crc kubenswrapper[4721]: I0202 13:15:25.889773 4721 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after 
being ready Feb 02 13:15:25 crc kubenswrapper[4721]: I0202 13:15:25.890656 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="b3605888-b0c3-4049-8f6a-cd4f380b91a7" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 02 13:15:31 crc kubenswrapper[4721]: I0202 13:15:31.865125 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tgnrr"] Feb 02 13:15:31 crc kubenswrapper[4721]: E0202 13:15:31.866170 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c984cb-b059-4e3f-86f2-8abca8e6942e" containerName="collect-profiles" Feb 02 13:15:31 crc kubenswrapper[4721]: I0202 13:15:31.866192 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c984cb-b059-4e3f-86f2-8abca8e6942e" containerName="collect-profiles" Feb 02 13:15:31 crc kubenswrapper[4721]: I0202 13:15:31.866431 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c984cb-b059-4e3f-86f2-8abca8e6942e" containerName="collect-profiles" Feb 02 13:15:31 crc kubenswrapper[4721]: I0202 13:15:31.868221 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tgnrr" Feb 02 13:15:31 crc kubenswrapper[4721]: I0202 13:15:31.876587 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tgnrr"] Feb 02 13:15:31 crc kubenswrapper[4721]: I0202 13:15:31.984915 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlcrv\" (UniqueName: \"kubernetes.io/projected/ff82afec-f54e-4b47-8399-fd27b44558d3-kube-api-access-rlcrv\") pod \"certified-operators-tgnrr\" (UID: \"ff82afec-f54e-4b47-8399-fd27b44558d3\") " pod="openshift-marketplace/certified-operators-tgnrr" Feb 02 13:15:31 crc kubenswrapper[4721]: I0202 13:15:31.985257 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff82afec-f54e-4b47-8399-fd27b44558d3-catalog-content\") pod \"certified-operators-tgnrr\" (UID: \"ff82afec-f54e-4b47-8399-fd27b44558d3\") " pod="openshift-marketplace/certified-operators-tgnrr" Feb 02 13:15:31 crc kubenswrapper[4721]: I0202 13:15:31.985329 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff82afec-f54e-4b47-8399-fd27b44558d3-utilities\") pod \"certified-operators-tgnrr\" (UID: \"ff82afec-f54e-4b47-8399-fd27b44558d3\") " pod="openshift-marketplace/certified-operators-tgnrr" Feb 02 13:15:32 crc kubenswrapper[4721]: I0202 13:15:32.086656 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff82afec-f54e-4b47-8399-fd27b44558d3-catalog-content\") pod \"certified-operators-tgnrr\" (UID: \"ff82afec-f54e-4b47-8399-fd27b44558d3\") " pod="openshift-marketplace/certified-operators-tgnrr" Feb 02 13:15:32 crc kubenswrapper[4721]: I0202 13:15:32.086944 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff82afec-f54e-4b47-8399-fd27b44558d3-utilities\") pod \"certified-operators-tgnrr\" (UID: \"ff82afec-f54e-4b47-8399-fd27b44558d3\") " pod="openshift-marketplace/certified-operators-tgnrr" Feb 02 13:15:32 crc kubenswrapper[4721]: I0202 
13:15:32.087107 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlcrv\" (UniqueName: \"kubernetes.io/projected/ff82afec-f54e-4b47-8399-fd27b44558d3-kube-api-access-rlcrv\") pod \"certified-operators-tgnrr\" (UID: \"ff82afec-f54e-4b47-8399-fd27b44558d3\") " pod="openshift-marketplace/certified-operators-tgnrr" Feb 02 13:15:32 crc kubenswrapper[4721]: I0202 13:15:32.087241 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff82afec-f54e-4b47-8399-fd27b44558d3-catalog-content\") pod \"certified-operators-tgnrr\" (UID: \"ff82afec-f54e-4b47-8399-fd27b44558d3\") " pod="openshift-marketplace/certified-operators-tgnrr" Feb 02 13:15:32 crc kubenswrapper[4721]: I0202 13:15:32.087392 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff82afec-f54e-4b47-8399-fd27b44558d3-utilities\") pod \"certified-operators-tgnrr\" (UID: \"ff82afec-f54e-4b47-8399-fd27b44558d3\") " pod="openshift-marketplace/certified-operators-tgnrr" Feb 02 13:15:32 crc kubenswrapper[4721]: I0202 13:15:32.112014 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlcrv\" (UniqueName: \"kubernetes.io/projected/ff82afec-f54e-4b47-8399-fd27b44558d3-kube-api-access-rlcrv\") pod \"certified-operators-tgnrr\" (UID: \"ff82afec-f54e-4b47-8399-fd27b44558d3\") " pod="openshift-marketplace/certified-operators-tgnrr" Feb 02 13:15:32 crc kubenswrapper[4721]: I0202 13:15:32.187499 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tgnrr" Feb 02 13:15:32 crc kubenswrapper[4721]: I0202 13:15:32.699746 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tgnrr"] Feb 02 13:15:33 crc kubenswrapper[4721]: I0202 13:15:33.302838 4721 generic.go:334] "Generic (PLEG): container finished" podID="ff82afec-f54e-4b47-8399-fd27b44558d3" containerID="e919ad3046080582c210b132a34bd292a9604a038e68ce187c8ada2cb5b4e644" exitCode=0 Feb 02 13:15:33 crc kubenswrapper[4721]: I0202 13:15:33.302901 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tgnrr" event={"ID":"ff82afec-f54e-4b47-8399-fd27b44558d3","Type":"ContainerDied","Data":"e919ad3046080582c210b132a34bd292a9604a038e68ce187c8ada2cb5b4e644"} Feb 02 13:15:33 crc kubenswrapper[4721]: I0202 13:15:33.302981 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tgnrr" event={"ID":"ff82afec-f54e-4b47-8399-fd27b44558d3","Type":"ContainerStarted","Data":"a199694561ebeb611256354bee65dd727d981486d3a78331a7d6bc24c71ceb99"} Feb 02 13:15:35 crc kubenswrapper[4721]: I0202 13:15:35.889983 4721 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Feb 02 13:15:35 crc kubenswrapper[4721]: I0202 13:15:35.890617 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="b3605888-b0c3-4049-8f6a-cd4f380b91a7" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 02 13:15:38 crc kubenswrapper[4721]: I0202 13:15:38.355881 4721 generic.go:334] "Generic (PLEG): container 
finished" podID="ff82afec-f54e-4b47-8399-fd27b44558d3" containerID="0bef98b1e102f71b642c054d20c427193cff60dddb3067aac1de670786f343f9" exitCode=0 Feb 02 13:15:38 crc kubenswrapper[4721]: I0202 13:15:38.355949 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tgnrr" event={"ID":"ff82afec-f54e-4b47-8399-fd27b44558d3","Type":"ContainerDied","Data":"0bef98b1e102f71b642c054d20c427193cff60dddb3067aac1de670786f343f9"} Feb 02 13:15:39 crc kubenswrapper[4721]: I0202 13:15:39.364642 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tgnrr" event={"ID":"ff82afec-f54e-4b47-8399-fd27b44558d3","Type":"ContainerStarted","Data":"c0f7d57133aed0dfb5f80491de3135629651a4698d8c515701f30eb672053a40"} Feb 02 13:15:39 crc kubenswrapper[4721]: I0202 13:15:39.386675 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tgnrr" podStartSLOduration=2.735167169 podStartE2EDuration="8.386653191s" podCreationTimestamp="2026-02-02 13:15:31 +0000 UTC" firstStartedPulling="2026-02-02 13:15:33.304608444 +0000 UTC m=+873.607122833" lastFinishedPulling="2026-02-02 13:15:38.956094466 +0000 UTC m=+879.258608855" observedRunningTime="2026-02-02 13:15:39.382777118 +0000 UTC m=+879.685291517" watchObservedRunningTime="2026-02-02 13:15:39.386653191 +0000 UTC m=+879.689167590" Feb 02 13:15:42 crc kubenswrapper[4721]: I0202 13:15:42.188001 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tgnrr" Feb 02 13:15:42 crc kubenswrapper[4721]: I0202 13:15:42.188570 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tgnrr" Feb 02 13:15:42 crc kubenswrapper[4721]: I0202 13:15:42.244206 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tgnrr" Feb 02 13:15:45 crc kubenswrapper[4721]: I0202 13:15:45.893372 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Feb 02 13:15:52 crc kubenswrapper[4721]: I0202 13:15:52.229320 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tgnrr" Feb 02 13:15:52 crc kubenswrapper[4721]: I0202 13:15:52.313854 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tgnrr"] Feb 02 13:15:52 crc kubenswrapper[4721]: I0202 13:15:52.362696 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5wlg"] Feb 02 13:15:52 crc kubenswrapper[4721]: I0202 13:15:52.362938 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w5wlg" podUID="2db39b59-16bf-4029-b8be-4be395b09cdf" containerName="registry-server" containerID="cri-o://fac2c6657c2c4b2c8f188a592777004421506c7d4a56916a7a265847528a9b87" gracePeriod=2 Feb 02 13:15:52 crc kubenswrapper[4721]: I0202 13:15:52.747655 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:15:52 crc kubenswrapper[4721]: I0202 13:15:52.811677 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db39b59-16bf-4029-b8be-4be395b09cdf-utilities\") pod \"2db39b59-16bf-4029-b8be-4be395b09cdf\" (UID: \"2db39b59-16bf-4029-b8be-4be395b09cdf\") " Feb 02 13:15:52 crc kubenswrapper[4721]: I0202 13:15:52.811859 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9j2g\" (UniqueName: \"kubernetes.io/projected/2db39b59-16bf-4029-b8be-4be395b09cdf-kube-api-access-c9j2g\") pod \"2db39b59-16bf-4029-b8be-4be395b09cdf\" (UID: \"2db39b59-16bf-4029-b8be-4be395b09cdf\") " Feb 02 13:15:52 crc kubenswrapper[4721]: I0202 13:15:52.811937 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db39b59-16bf-4029-b8be-4be395b09cdf-catalog-content\") pod \"2db39b59-16bf-4029-b8be-4be395b09cdf\" (UID: \"2db39b59-16bf-4029-b8be-4be395b09cdf\") " Feb 02 13:15:52 crc kubenswrapper[4721]: I0202 13:15:52.817964 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db39b59-16bf-4029-b8be-4be395b09cdf-utilities" (OuterVolumeSpecName: "utilities") pod "2db39b59-16bf-4029-b8be-4be395b09cdf" (UID: "2db39b59-16bf-4029-b8be-4be395b09cdf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:15:52 crc kubenswrapper[4721]: I0202 13:15:52.823506 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db39b59-16bf-4029-b8be-4be395b09cdf-kube-api-access-c9j2g" (OuterVolumeSpecName: "kube-api-access-c9j2g") pod "2db39b59-16bf-4029-b8be-4be395b09cdf" (UID: "2db39b59-16bf-4029-b8be-4be395b09cdf"). InnerVolumeSpecName "kube-api-access-c9j2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:15:52 crc kubenswrapper[4721]: I0202 13:15:52.885025 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db39b59-16bf-4029-b8be-4be395b09cdf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2db39b59-16bf-4029-b8be-4be395b09cdf" (UID: "2db39b59-16bf-4029-b8be-4be395b09cdf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:15:52 crc kubenswrapper[4721]: I0202 13:15:52.914349 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db39b59-16bf-4029-b8be-4be395b09cdf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:15:52 crc kubenswrapper[4721]: I0202 13:15:52.914402 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db39b59-16bf-4029-b8be-4be395b09cdf-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:15:52 crc kubenswrapper[4721]: I0202 13:15:52.914416 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9j2g\" (UniqueName: \"kubernetes.io/projected/2db39b59-16bf-4029-b8be-4be395b09cdf-kube-api-access-c9j2g\") on node \"crc\" DevicePath \"\"" Feb 02 13:15:53 crc kubenswrapper[4721]: I0202 13:15:53.452812 4721 generic.go:334] "Generic (PLEG): container finished" podID="2db39b59-16bf-4029-b8be-4be395b09cdf" containerID="fac2c6657c2c4b2c8f188a592777004421506c7d4a56916a7a265847528a9b87" exitCode=0 Feb 02 13:15:53 crc kubenswrapper[4721]: I0202 13:15:53.452872 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5wlg" event={"ID":"2db39b59-16bf-4029-b8be-4be395b09cdf","Type":"ContainerDied","Data":"fac2c6657c2c4b2c8f188a592777004421506c7d4a56916a7a265847528a9b87"} Feb 02 13:15:53 crc kubenswrapper[4721]: I0202 13:15:53.453167 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5wlg" event={"ID":"2db39b59-16bf-4029-b8be-4be395b09cdf","Type":"ContainerDied","Data":"aa53f668db48e2aa112a5cbfcc3d2601a92bffa612a49b3105e8618823c15e6b"} Feb 02 13:15:53 crc kubenswrapper[4721]: I0202 13:15:53.453190 4721 scope.go:117] "RemoveContainer" containerID="fac2c6657c2c4b2c8f188a592777004421506c7d4a56916a7a265847528a9b87" Feb 02 13:15:53 crc kubenswrapper[4721]: I0202 13:15:53.452886 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5wlg" Feb 02 13:15:53 crc kubenswrapper[4721]: I0202 13:15:53.472500 4721 scope.go:117] "RemoveContainer" containerID="6a9b786e3b23e3aaade71c1314045316e2616b985dc63d18d872102a5103829f" Feb 02 13:15:53 crc kubenswrapper[4721]: I0202 13:15:53.489633 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5wlg"] Feb 02 13:15:53 crc kubenswrapper[4721]: I0202 13:15:53.505690 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w5wlg"] Feb 02 13:15:53 crc kubenswrapper[4721]: I0202 13:15:53.517511 4721 scope.go:117] "RemoveContainer" containerID="cda228ff8b653e386af95f32890f0fe9189a306544ff456cd8d4ee6c401233dc" Feb 02 13:15:53 crc kubenswrapper[4721]: I0202 13:15:53.549946 4721 scope.go:117] "RemoveContainer" containerID="fac2c6657c2c4b2c8f188a592777004421506c7d4a56916a7a265847528a9b87" Feb 02 13:15:53 crc kubenswrapper[4721]: E0202 13:15:53.576550 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac2c6657c2c4b2c8f188a592777004421506c7d4a56916a7a265847528a9b87\": container with ID starting with fac2c6657c2c4b2c8f188a592777004421506c7d4a56916a7a265847528a9b87 not found: ID does not exist" containerID="fac2c6657c2c4b2c8f188a592777004421506c7d4a56916a7a265847528a9b87" Feb 02 13:15:53 crc kubenswrapper[4721]: I0202 13:15:53.576644 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac2c6657c2c4b2c8f188a592777004421506c7d4a56916a7a265847528a9b87"} err="failed to get container status \"fac2c6657c2c4b2c8f188a592777004421506c7d4a56916a7a265847528a9b87\": rpc error: code = NotFound desc = could not find container \"fac2c6657c2c4b2c8f188a592777004421506c7d4a56916a7a265847528a9b87\": container with ID starting with fac2c6657c2c4b2c8f188a592777004421506c7d4a56916a7a265847528a9b87 not found: ID does not exist" Feb 02 13:15:53 crc kubenswrapper[4721]: I0202 13:15:53.576709 4721 scope.go:117] "RemoveContainer" containerID="6a9b786e3b23e3aaade71c1314045316e2616b985dc63d18d872102a5103829f" Feb 02 13:15:53 crc kubenswrapper[4721]: E0202 13:15:53.577289 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a9b786e3b23e3aaade71c1314045316e2616b985dc63d18d872102a5103829f\": container with ID starting with 6a9b786e3b23e3aaade71c1314045316e2616b985dc63d18d872102a5103829f not found: ID does not exist" containerID="6a9b786e3b23e3aaade71c1314045316e2616b985dc63d18d872102a5103829f" Feb 02 13:15:53 crc kubenswrapper[4721]: I0202 13:15:53.577350 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a9b786e3b23e3aaade71c1314045316e2616b985dc63d18d872102a5103829f"} err="failed to get container status \"6a9b786e3b23e3aaade71c1314045316e2616b985dc63d18d872102a5103829f\": rpc error: code = NotFound desc = could not find container \"6a9b786e3b23e3aaade71c1314045316e2616b985dc63d18d872102a5103829f\": container with ID starting with 6a9b786e3b23e3aaade71c1314045316e2616b985dc63d18d872102a5103829f not found: ID does not exist" Feb 02 13:15:53 crc kubenswrapper[4721]: I0202 13:15:53.577384 4721 scope.go:117] "RemoveContainer" containerID="cda228ff8b653e386af95f32890f0fe9189a306544ff456cd8d4ee6c401233dc" Feb 02 13:15:53 crc kubenswrapper[4721]: E0202 13:15:53.577666 4721 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"cda228ff8b653e386af95f32890f0fe9189a306544ff456cd8d4ee6c401233dc\": container with ID starting with cda228ff8b653e386af95f32890f0fe9189a306544ff456cd8d4ee6c401233dc not found: ID does not exist" containerID="cda228ff8b653e386af95f32890f0fe9189a306544ff456cd8d4ee6c401233dc" Feb 02 13:15:53 crc kubenswrapper[4721]: I0202 13:15:53.577694 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda228ff8b653e386af95f32890f0fe9189a306544ff456cd8d4ee6c401233dc"} err="failed to get container status \"cda228ff8b653e386af95f32890f0fe9189a306544ff456cd8d4ee6c401233dc\": rpc error: code = NotFound desc = could not find container \"cda228ff8b653e386af95f32890f0fe9189a306544ff456cd8d4ee6c401233dc\": container with ID starting with cda228ff8b653e386af95f32890f0fe9189a306544ff456cd8d4ee6c401233dc not found: ID does not exist" Feb 02 13:15:54 crc kubenswrapper[4721]: I0202 13:15:54.417567 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db39b59-16bf-4029-b8be-4be395b09cdf" path="/var/lib/kubelet/pods/2db39b59-16bf-4029-b8be-4be395b09cdf/volumes" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.333519 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-x8xcn"] Feb 02 13:16:02 crc kubenswrapper[4721]: E0202 13:16:02.334316 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db39b59-16bf-4029-b8be-4be395b09cdf" containerName="registry-server" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.334329 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db39b59-16bf-4029-b8be-4be395b09cdf" containerName="registry-server" Feb 02 13:16:02 crc kubenswrapper[4721]: E0202 13:16:02.334359 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db39b59-16bf-4029-b8be-4be395b09cdf" containerName="extract-utilities" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.334365 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db39b59-16bf-4029-b8be-4be395b09cdf" containerName="extract-utilities" Feb 02 13:16:02 crc kubenswrapper[4721]: E0202 13:16:02.334372 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db39b59-16bf-4029-b8be-4be395b09cdf" containerName="extract-content" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.334378 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db39b59-16bf-4029-b8be-4be395b09cdf" containerName="extract-content" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.334496 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db39b59-16bf-4029-b8be-4be395b09cdf" containerName="registry-server" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.335085 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.338489 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.338488 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.339028 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.339227 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.339255 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-vwh88" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.347916 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-x8xcn"] Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.348631 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.406320 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-x8xcn"] Feb 02 13:16:02 crc kubenswrapper[4721]: E0202 13:16:02.406837 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-8wznf metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-8wznf metrics sa-token tmp trusted-ca]: context canceled" pod="openshift-logging/collector-x8xcn" podUID="107cfbba-9034-4f35-adf0-801a876e9d52" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.467596 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-config-openshift-service-cacrt\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.467694 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-collector-syslog-receiver\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.467732 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-trusted-ca\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.467770 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/107cfbba-9034-4f35-adf0-801a876e9d52-tmp\") pod \"collector-x8xcn\" (UID: 
\"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.467793 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-collector-token\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.467839 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-config\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.467859 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/107cfbba-9034-4f35-adf0-801a876e9d52-sa-token\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.467887 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/107cfbba-9034-4f35-adf0-801a876e9d52-datadir\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.467910 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wznf\" (UniqueName: \"kubernetes.io/projected/107cfbba-9034-4f35-adf0-801a876e9d52-kube-api-access-8wznf\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.467972 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-entrypoint\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.468014 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-metrics\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.510881 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.521266 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.569357 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-metrics\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.569431 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-config-openshift-service-cacrt\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.569487 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-collector-syslog-receiver\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.569509 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-trusted-ca\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.569556 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/107cfbba-9034-4f35-adf0-801a876e9d52-tmp\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.569578 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-collector-token\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.569603 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-config\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.569623 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/107cfbba-9034-4f35-adf0-801a876e9d52-sa-token\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.569651 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/107cfbba-9034-4f35-adf0-801a876e9d52-datadir\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.569675 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wznf\" (UniqueName: 
\"kubernetes.io/projected/107cfbba-9034-4f35-adf0-801a876e9d52-kube-api-access-8wznf\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.569721 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-entrypoint\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.570606 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-entrypoint\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: E0202 13:16:02.571916 4721 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Feb 02 13:16:02 crc kubenswrapper[4721]: E0202 13:16:02.572292 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-metrics podName:107cfbba-9034-4f35-adf0-801a876e9d52 nodeName:}" failed. No retries permitted until 2026-02-02 13:16:03.072273695 +0000 UTC m=+903.374788084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-metrics") pod "collector-x8xcn" (UID: "107cfbba-9034-4f35-adf0-801a876e9d52") : secret "collector-metrics" not found Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.573358 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-trusted-ca\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.573470 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-config-openshift-service-cacrt\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.573996 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/107cfbba-9034-4f35-adf0-801a876e9d52-datadir\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.574046 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-config\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.578669 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/107cfbba-9034-4f35-adf0-801a876e9d52-tmp\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 
13:16:02.580260 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-collector-syslog-receiver\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.580315 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-collector-token\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.593465 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/107cfbba-9034-4f35-adf0-801a876e9d52-sa-token\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.594299 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wznf\" (UniqueName: \"kubernetes.io/projected/107cfbba-9034-4f35-adf0-801a876e9d52-kube-api-access-8wznf\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.670521 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-collector-token\") pod \"107cfbba-9034-4f35-adf0-801a876e9d52\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.670629 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/107cfbba-9034-4f35-adf0-801a876e9d52-datadir\") pod \"107cfbba-9034-4f35-adf0-801a876e9d52\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.670672 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-collector-syslog-receiver\") pod \"107cfbba-9034-4f35-adf0-801a876e9d52\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.670706 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-config-openshift-service-cacrt\") pod \"107cfbba-9034-4f35-adf0-801a876e9d52\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.670806 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-config\") pod \"107cfbba-9034-4f35-adf0-801a876e9d52\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.670875 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wznf\" (UniqueName: \"kubernetes.io/projected/107cfbba-9034-4f35-adf0-801a876e9d52-kube-api-access-8wznf\") pod \"107cfbba-9034-4f35-adf0-801a876e9d52\" (UID: 
\"107cfbba-9034-4f35-adf0-801a876e9d52\") " Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.670899 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-trusted-ca\") pod \"107cfbba-9034-4f35-adf0-801a876e9d52\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.670934 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-entrypoint\") pod \"107cfbba-9034-4f35-adf0-801a876e9d52\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.670966 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/107cfbba-9034-4f35-adf0-801a876e9d52-sa-token\") pod \"107cfbba-9034-4f35-adf0-801a876e9d52\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.670994 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/107cfbba-9034-4f35-adf0-801a876e9d52-tmp\") pod \"107cfbba-9034-4f35-adf0-801a876e9d52\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.671685 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-config" (OuterVolumeSpecName: "config") pod "107cfbba-9034-4f35-adf0-801a876e9d52" (UID: "107cfbba-9034-4f35-adf0-801a876e9d52"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.671867 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/107cfbba-9034-4f35-adf0-801a876e9d52-datadir" (OuterVolumeSpecName: "datadir") pod "107cfbba-9034-4f35-adf0-801a876e9d52" (UID: "107cfbba-9034-4f35-adf0-801a876e9d52"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.672132 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "107cfbba-9034-4f35-adf0-801a876e9d52" (UID: "107cfbba-9034-4f35-adf0-801a876e9d52"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.672299 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "107cfbba-9034-4f35-adf0-801a876e9d52" (UID: "107cfbba-9034-4f35-adf0-801a876e9d52"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.672662 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "107cfbba-9034-4f35-adf0-801a876e9d52" (UID: "107cfbba-9034-4f35-adf0-801a876e9d52"). InnerVolumeSpecName "entrypoint". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.675518 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-collector-token" (OuterVolumeSpecName: "collector-token") pod "107cfbba-9034-4f35-adf0-801a876e9d52" (UID: "107cfbba-9034-4f35-adf0-801a876e9d52"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.679575 4721 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.679612 4721 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-entrypoint\") on node \"crc\" DevicePath \"\"" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.679623 4721 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-collector-token\") on node \"crc\" DevicePath \"\"" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.679632 4721 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/107cfbba-9034-4f35-adf0-801a876e9d52-datadir\") on node \"crc\" DevicePath \"\"" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.679642 4721 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.679650 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/107cfbba-9034-4f35-adf0-801a876e9d52-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.682388 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "107cfbba-9034-4f35-adf0-801a876e9d52" (UID: "107cfbba-9034-4f35-adf0-801a876e9d52"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.682423 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107cfbba-9034-4f35-adf0-801a876e9d52-tmp" (OuterVolumeSpecName: "tmp") pod "107cfbba-9034-4f35-adf0-801a876e9d52" (UID: "107cfbba-9034-4f35-adf0-801a876e9d52"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.685462 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107cfbba-9034-4f35-adf0-801a876e9d52-kube-api-access-8wznf" (OuterVolumeSpecName: "kube-api-access-8wznf") pod "107cfbba-9034-4f35-adf0-801a876e9d52" (UID: "107cfbba-9034-4f35-adf0-801a876e9d52"). InnerVolumeSpecName "kube-api-access-8wznf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.691850 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107cfbba-9034-4f35-adf0-801a876e9d52-sa-token" (OuterVolumeSpecName: "sa-token") pod "107cfbba-9034-4f35-adf0-801a876e9d52" (UID: "107cfbba-9034-4f35-adf0-801a876e9d52"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.781046 4721 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.781319 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wznf\" (UniqueName: \"kubernetes.io/projected/107cfbba-9034-4f35-adf0-801a876e9d52-kube-api-access-8wznf\") on node \"crc\" DevicePath \"\"" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.781383 4721 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/107cfbba-9034-4f35-adf0-801a876e9d52-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 13:16:02 crc kubenswrapper[4721]: I0202 13:16:02.781441 4721 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/107cfbba-9034-4f35-adf0-801a876e9d52-tmp\") on node \"crc\" DevicePath \"\"" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.086831 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-metrics\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.092576 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-metrics\") pod \"collector-x8xcn\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " pod="openshift-logging/collector-x8xcn" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.188635 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-metrics\") pod \"107cfbba-9034-4f35-adf0-801a876e9d52\" (UID: \"107cfbba-9034-4f35-adf0-801a876e9d52\") " Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.191465 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-metrics" (OuterVolumeSpecName: "metrics") pod "107cfbba-9034-4f35-adf0-801a876e9d52" (UID: "107cfbba-9034-4f35-adf0-801a876e9d52"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.290674 4721 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/107cfbba-9034-4f35-adf0-801a876e9d52-metrics\") on node \"crc\" DevicePath \"\"" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.517154 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-x8xcn" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.570435 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-x8xcn"] Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.581246 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-x8xcn"] Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.585745 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-ls7f7"] Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.586720 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.588840 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.592388 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.592582 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.592702 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-vwh88" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.592890 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.601547 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.609865 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-ls7f7"] Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.699941 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n64dr\" (UniqueName: \"kubernetes.io/projected/749232df-9bfe-43cb-a716-6eadd2cbc290-kube-api-access-n64dr\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.700022 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/749232df-9bfe-43cb-a716-6eadd2cbc290-datadir\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.700049 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/749232df-9bfe-43cb-a716-6eadd2cbc290-sa-token\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.700201 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/749232df-9bfe-43cb-a716-6eadd2cbc290-config\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.700275 4721 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/749232df-9bfe-43cb-a716-6eadd2cbc290-entrypoint\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.700301 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/749232df-9bfe-43cb-a716-6eadd2cbc290-metrics\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.700321 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/749232df-9bfe-43cb-a716-6eadd2cbc290-collector-syslog-receiver\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.700351 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/749232df-9bfe-43cb-a716-6eadd2cbc290-collector-token\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.700486 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/749232df-9bfe-43cb-a716-6eadd2cbc290-trusted-ca\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.700555 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/749232df-9bfe-43cb-a716-6eadd2cbc290-config-openshift-service-cacrt\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.700739 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/749232df-9bfe-43cb-a716-6eadd2cbc290-tmp\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.802369 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/749232df-9bfe-43cb-a716-6eadd2cbc290-tmp\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.802437 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n64dr\" (UniqueName: \"kubernetes.io/projected/749232df-9bfe-43cb-a716-6eadd2cbc290-kube-api-access-n64dr\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.802479 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: 
\"kubernetes.io/host-path/749232df-9bfe-43cb-a716-6eadd2cbc290-datadir\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.802497 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/749232df-9bfe-43cb-a716-6eadd2cbc290-sa-token\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.802511 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/749232df-9bfe-43cb-a716-6eadd2cbc290-config\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.802583 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/749232df-9bfe-43cb-a716-6eadd2cbc290-datadir\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.802964 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/749232df-9bfe-43cb-a716-6eadd2cbc290-entrypoint\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.802987 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/749232df-9bfe-43cb-a716-6eadd2cbc290-metrics\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.803004 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/749232df-9bfe-43cb-a716-6eadd2cbc290-collector-syslog-receiver\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.803881 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/749232df-9bfe-43cb-a716-6eadd2cbc290-collector-token\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.803833 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/749232df-9bfe-43cb-a716-6eadd2cbc290-config\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.803952 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/749232df-9bfe-43cb-a716-6eadd2cbc290-trusted-ca\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.804131 4721 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/749232df-9bfe-43cb-a716-6eadd2cbc290-entrypoint\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.803975 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/749232df-9bfe-43cb-a716-6eadd2cbc290-config-openshift-service-cacrt\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.805107 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/749232df-9bfe-43cb-a716-6eadd2cbc290-config-openshift-service-cacrt\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.806508 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/749232df-9bfe-43cb-a716-6eadd2cbc290-trusted-ca\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.806670 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/749232df-9bfe-43cb-a716-6eadd2cbc290-tmp\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.806819 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/749232df-9bfe-43cb-a716-6eadd2cbc290-collector-syslog-receiver\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.807339 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/749232df-9bfe-43cb-a716-6eadd2cbc290-metrics\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.819371 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n64dr\" (UniqueName: \"kubernetes.io/projected/749232df-9bfe-43cb-a716-6eadd2cbc290-kube-api-access-n64dr\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.820116 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/749232df-9bfe-43cb-a716-6eadd2cbc290-collector-token\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.821723 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/749232df-9bfe-43cb-a716-6eadd2cbc290-sa-token\") pod \"collector-ls7f7\" (UID: \"749232df-9bfe-43cb-a716-6eadd2cbc290\") " pod="openshift-logging/collector-ls7f7" 
Feb 02 13:16:03 crc kubenswrapper[4721]: I0202 13:16:03.932517 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-ls7f7" Feb 02 13:16:04 crc kubenswrapper[4721]: I0202 13:16:04.349295 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-ls7f7"] Feb 02 13:16:04 crc kubenswrapper[4721]: I0202 13:16:04.417863 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107cfbba-9034-4f35-adf0-801a876e9d52" path="/var/lib/kubelet/pods/107cfbba-9034-4f35-adf0-801a876e9d52/volumes" Feb 02 13:16:04 crc kubenswrapper[4721]: I0202 13:16:04.525830 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-ls7f7" event={"ID":"749232df-9bfe-43cb-a716-6eadd2cbc290","Type":"ContainerStarted","Data":"a442f97eabaa9600446ad9cd88843ba6f3cfc81fcbeffbd34a8f940bc84f3190"} Feb 02 13:16:11 crc kubenswrapper[4721]: I0202 13:16:11.599688 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-ls7f7" event={"ID":"749232df-9bfe-43cb-a716-6eadd2cbc290","Type":"ContainerStarted","Data":"80a10c759d315ca6b2c4e5baabd5c2724352a80af6e7d18599a82ece805edbbe"} Feb 02 13:16:11 crc kubenswrapper[4721]: I0202 13:16:11.621702 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-ls7f7" podStartSLOduration=1.77901154 podStartE2EDuration="8.621678564s" podCreationTimestamp="2026-02-02 13:16:03 +0000 UTC" firstStartedPulling="2026-02-02 13:16:04.360030364 +0000 UTC m=+904.662544753" lastFinishedPulling="2026-02-02 13:16:11.202697388 +0000 UTC m=+911.505211777" observedRunningTime="2026-02-02 13:16:11.617682607 +0000 UTC m=+911.920197016" watchObservedRunningTime="2026-02-02 13:16:11.621678564 +0000 UTC m=+911.924192963" Feb 02 13:16:41 crc kubenswrapper[4721]: I0202 13:16:41.733851 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl"] Feb 02 13:16:41 crc kubenswrapper[4721]: I0202 13:16:41.741325 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" Feb 02 13:16:41 crc kubenswrapper[4721]: I0202 13:16:41.743838 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 02 13:16:41 crc kubenswrapper[4721]: I0202 13:16:41.761183 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl"] Feb 02 13:16:41 crc kubenswrapper[4721]: I0202 13:16:41.833895 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d506202-dc87-49f6-9160-ccedb0cbae19-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl\" (UID: \"3d506202-dc87-49f6-9160-ccedb0cbae19\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" Feb 02 13:16:41 crc kubenswrapper[4721]: I0202 13:16:41.833969 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d506202-dc87-49f6-9160-ccedb0cbae19-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl\" (UID: \"3d506202-dc87-49f6-9160-ccedb0cbae19\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" Feb 02 13:16:41 crc kubenswrapper[4721]: I0202 13:16:41.834018 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxxlb\" (UniqueName: \"kubernetes.io/projected/3d506202-dc87-49f6-9160-ccedb0cbae19-kube-api-access-fxxlb\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl\" (UID: \"3d506202-dc87-49f6-9160-ccedb0cbae19\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" Feb 02 13:16:41 crc kubenswrapper[4721]: I0202 13:16:41.934497 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d506202-dc87-49f6-9160-ccedb0cbae19-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl\" (UID: \"3d506202-dc87-49f6-9160-ccedb0cbae19\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" Feb 02 13:16:41 crc kubenswrapper[4721]: I0202 13:16:41.934560 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d506202-dc87-49f6-9160-ccedb0cbae19-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl\" (UID: \"3d506202-dc87-49f6-9160-ccedb0cbae19\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" Feb 02 13:16:41 crc kubenswrapper[4721]: I0202 13:16:41.934595 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxxlb\" (UniqueName: \"kubernetes.io/projected/3d506202-dc87-49f6-9160-ccedb0cbae19-kube-api-access-fxxlb\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl\" (UID: \"3d506202-dc87-49f6-9160-ccedb0cbae19\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" Feb 02 13:16:41 crc kubenswrapper[4721]: I0202 13:16:41.935149 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/3d506202-dc87-49f6-9160-ccedb0cbae19-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl\" (UID: \"3d506202-dc87-49f6-9160-ccedb0cbae19\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" Feb 02 13:16:41 crc kubenswrapper[4721]: I0202 13:16:41.935262 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d506202-dc87-49f6-9160-ccedb0cbae19-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl\" (UID: \"3d506202-dc87-49f6-9160-ccedb0cbae19\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" Feb 02 13:16:41 crc kubenswrapper[4721]: I0202 13:16:41.960215 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxxlb\" (UniqueName: \"kubernetes.io/projected/3d506202-dc87-49f6-9160-ccedb0cbae19-kube-api-access-fxxlb\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl\" (UID: \"3d506202-dc87-49f6-9160-ccedb0cbae19\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" Feb 02 13:16:42 crc kubenswrapper[4721]: I0202 13:16:42.062724 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" Feb 02 13:16:42 crc kubenswrapper[4721]: I0202 13:16:42.491079 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl"] Feb 02 13:16:42 crc kubenswrapper[4721]: I0202 13:16:42.820395 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" event={"ID":"3d506202-dc87-49f6-9160-ccedb0cbae19","Type":"ContainerStarted","Data":"52dc1a499dcb641663e9ecef3d7f9f4fb35334a80304139e481a1741543ca999"} Feb 02 13:16:42 crc kubenswrapper[4721]: I0202 13:16:42.820803 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" event={"ID":"3d506202-dc87-49f6-9160-ccedb0cbae19","Type":"ContainerStarted","Data":"d3848dde6c1b685fc6b63243fd6d7fb1adb23abdff42209ec7e990c5480d1f60"} Feb 02 13:16:43 crc kubenswrapper[4721]: I0202 13:16:43.828159 4721 generic.go:334] "Generic (PLEG): container finished" podID="3d506202-dc87-49f6-9160-ccedb0cbae19" containerID="52dc1a499dcb641663e9ecef3d7f9f4fb35334a80304139e481a1741543ca999" exitCode=0 Feb 02 13:16:43 crc kubenswrapper[4721]: I0202 13:16:43.828266 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" event={"ID":"3d506202-dc87-49f6-9160-ccedb0cbae19","Type":"ContainerDied","Data":"52dc1a499dcb641663e9ecef3d7f9f4fb35334a80304139e481a1741543ca999"} Feb 02 13:16:45 crc kubenswrapper[4721]: I0202 13:16:45.849342 4721 generic.go:334] "Generic (PLEG): container finished" podID="3d506202-dc87-49f6-9160-ccedb0cbae19" containerID="527f07b7f2fe1f7e6b7d5c148313ddb9f7dcbbbe099ba5a5515ef28eeb0ed2fa" exitCode=0 Feb 02 13:16:45 crc kubenswrapper[4721]: I0202 13:16:45.849522 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" 
event={"ID":"3d506202-dc87-49f6-9160-ccedb0cbae19","Type":"ContainerDied","Data":"527f07b7f2fe1f7e6b7d5c148313ddb9f7dcbbbe099ba5a5515ef28eeb0ed2fa"} Feb 02 13:16:46 crc kubenswrapper[4721]: I0202 13:16:46.859013 4721 generic.go:334] "Generic (PLEG): container finished" podID="3d506202-dc87-49f6-9160-ccedb0cbae19" containerID="191d211ba575e63f5df87768edb2801a7a7407d76c4d1b7a509418e1f62193f2" exitCode=0 Feb 02 13:16:46 crc kubenswrapper[4721]: I0202 13:16:46.859175 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" event={"ID":"3d506202-dc87-49f6-9160-ccedb0cbae19","Type":"ContainerDied","Data":"191d211ba575e63f5df87768edb2801a7a7407d76c4d1b7a509418e1f62193f2"} Feb 02 13:16:48 crc kubenswrapper[4721]: I0202 13:16:48.239221 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" Feb 02 13:16:48 crc kubenswrapper[4721]: I0202 13:16:48.435501 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxxlb\" (UniqueName: \"kubernetes.io/projected/3d506202-dc87-49f6-9160-ccedb0cbae19-kube-api-access-fxxlb\") pod \"3d506202-dc87-49f6-9160-ccedb0cbae19\" (UID: \"3d506202-dc87-49f6-9160-ccedb0cbae19\") " Feb 02 13:16:48 crc kubenswrapper[4721]: I0202 13:16:48.436668 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d506202-dc87-49f6-9160-ccedb0cbae19-util\") pod \"3d506202-dc87-49f6-9160-ccedb0cbae19\" (UID: \"3d506202-dc87-49f6-9160-ccedb0cbae19\") " Feb 02 13:16:48 crc kubenswrapper[4721]: I0202 13:16:48.436735 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d506202-dc87-49f6-9160-ccedb0cbae19-bundle\") pod \"3d506202-dc87-49f6-9160-ccedb0cbae19\" (UID: \"3d506202-dc87-49f6-9160-ccedb0cbae19\") " Feb 02 13:16:48 crc kubenswrapper[4721]: I0202 13:16:48.437680 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d506202-dc87-49f6-9160-ccedb0cbae19-bundle" (OuterVolumeSpecName: "bundle") pod "3d506202-dc87-49f6-9160-ccedb0cbae19" (UID: "3d506202-dc87-49f6-9160-ccedb0cbae19"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:16:48 crc kubenswrapper[4721]: I0202 13:16:48.447350 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d506202-dc87-49f6-9160-ccedb0cbae19-kube-api-access-fxxlb" (OuterVolumeSpecName: "kube-api-access-fxxlb") pod "3d506202-dc87-49f6-9160-ccedb0cbae19" (UID: "3d506202-dc87-49f6-9160-ccedb0cbae19"). InnerVolumeSpecName "kube-api-access-fxxlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:16:48 crc kubenswrapper[4721]: I0202 13:16:48.450312 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d506202-dc87-49f6-9160-ccedb0cbae19-util" (OuterVolumeSpecName: "util") pod "3d506202-dc87-49f6-9160-ccedb0cbae19" (UID: "3d506202-dc87-49f6-9160-ccedb0cbae19"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:16:48 crc kubenswrapper[4721]: I0202 13:16:48.539969 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxxlb\" (UniqueName: \"kubernetes.io/projected/3d506202-dc87-49f6-9160-ccedb0cbae19-kube-api-access-fxxlb\") on node \"crc\" DevicePath \"\"" Feb 02 13:16:48 crc kubenswrapper[4721]: I0202 13:16:48.540011 4721 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d506202-dc87-49f6-9160-ccedb0cbae19-util\") on node \"crc\" DevicePath \"\"" Feb 02 13:16:48 crc kubenswrapper[4721]: I0202 13:16:48.540022 4721 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d506202-dc87-49f6-9160-ccedb0cbae19-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:16:48 crc kubenswrapper[4721]: I0202 13:16:48.874939 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" event={"ID":"3d506202-dc87-49f6-9160-ccedb0cbae19","Type":"ContainerDied","Data":"d3848dde6c1b685fc6b63243fd6d7fb1adb23abdff42209ec7e990c5480d1f60"} Feb 02 13:16:48 crc kubenswrapper[4721]: I0202 13:16:48.874983 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3848dde6c1b685fc6b63243fd6d7fb1adb23abdff42209ec7e990c5480d1f60" Feb 02 13:16:48 crc kubenswrapper[4721]: I0202 13:16:48.875018 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl" Feb 02 13:16:51 crc kubenswrapper[4721]: I0202 13:16:51.776522 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-trhxn"] Feb 02 13:16:51 crc kubenswrapper[4721]: E0202 13:16:51.777465 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d506202-dc87-49f6-9160-ccedb0cbae19" containerName="extract" Feb 02 13:16:51 crc kubenswrapper[4721]: I0202 13:16:51.777482 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d506202-dc87-49f6-9160-ccedb0cbae19" containerName="extract" Feb 02 13:16:51 crc kubenswrapper[4721]: E0202 13:16:51.777501 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d506202-dc87-49f6-9160-ccedb0cbae19" containerName="util" Feb 02 13:16:51 crc kubenswrapper[4721]: I0202 13:16:51.777509 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d506202-dc87-49f6-9160-ccedb0cbae19" containerName="util" Feb 02 13:16:51 crc kubenswrapper[4721]: E0202 13:16:51.777529 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d506202-dc87-49f6-9160-ccedb0cbae19" containerName="pull" Feb 02 13:16:51 crc kubenswrapper[4721]: I0202 13:16:51.777538 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d506202-dc87-49f6-9160-ccedb0cbae19" containerName="pull" Feb 02 13:16:51 crc kubenswrapper[4721]: I0202 13:16:51.777713 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d506202-dc87-49f6-9160-ccedb0cbae19" containerName="extract" Feb 02 13:16:51 crc kubenswrapper[4721]: I0202 13:16:51.778391 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-trhxn" Feb 02 13:16:51 crc kubenswrapper[4721]: I0202 13:16:51.780325 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 02 13:16:51 crc kubenswrapper[4721]: I0202 13:16:51.780338 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-mp7tk" Feb 02 13:16:51 crc kubenswrapper[4721]: I0202 13:16:51.780946 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 02 13:16:51 crc kubenswrapper[4721]: I0202 13:16:51.794337 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-trhxn"] Feb 02 13:16:51 crc kubenswrapper[4721]: I0202 13:16:51.888236 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f6tz\" (UniqueName: \"kubernetes.io/projected/38f375ca-8f76-4eb1-a92d-d46f7628ecf6-kube-api-access-9f6tz\") pod \"nmstate-operator-646758c888-trhxn\" (UID: \"38f375ca-8f76-4eb1-a92d-d46f7628ecf6\") " pod="openshift-nmstate/nmstate-operator-646758c888-trhxn" Feb 02 13:16:51 crc kubenswrapper[4721]: I0202 13:16:51.990022 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f6tz\" (UniqueName: \"kubernetes.io/projected/38f375ca-8f76-4eb1-a92d-d46f7628ecf6-kube-api-access-9f6tz\") pod \"nmstate-operator-646758c888-trhxn\" (UID: \"38f375ca-8f76-4eb1-a92d-d46f7628ecf6\") " pod="openshift-nmstate/nmstate-operator-646758c888-trhxn" Feb 02 13:16:52 crc kubenswrapper[4721]: I0202 13:16:52.020887 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f6tz\" (UniqueName: \"kubernetes.io/projected/38f375ca-8f76-4eb1-a92d-d46f7628ecf6-kube-api-access-9f6tz\") pod \"nmstate-operator-646758c888-trhxn\" (UID: \"38f375ca-8f76-4eb1-a92d-d46f7628ecf6\") " pod="openshift-nmstate/nmstate-operator-646758c888-trhxn" Feb 02 13:16:52 crc kubenswrapper[4721]: I0202 13:16:52.099605 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-trhxn" Feb 02 13:16:52 crc kubenswrapper[4721]: I0202 13:16:52.588570 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-trhxn"] Feb 02 13:16:52 crc kubenswrapper[4721]: I0202 13:16:52.902979 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-trhxn" event={"ID":"38f375ca-8f76-4eb1-a92d-d46f7628ecf6","Type":"ContainerStarted","Data":"2ddab410ccbb6b22dcf76c89181dd9a3f1757aee320a55c24ac86d3892d979bd"} Feb 02 13:16:54 crc kubenswrapper[4721]: I0202 13:16:54.925096 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-trhxn" event={"ID":"38f375ca-8f76-4eb1-a92d-d46f7628ecf6","Type":"ContainerStarted","Data":"016ee087f7d4223d77ed13a94ebcc70893dca997181bea85c835c27571ae8e35"} Feb 02 13:16:54 crc kubenswrapper[4721]: I0202 13:16:54.941808 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-trhxn" podStartSLOduration=2.055682439 podStartE2EDuration="3.941791984s" podCreationTimestamp="2026-02-02 13:16:51 +0000 UTC" firstStartedPulling="2026-02-02 13:16:52.594761554 +0000 UTC m=+952.897275943" lastFinishedPulling="2026-02-02 13:16:54.480871099 +0000 UTC m=+954.783385488" observedRunningTime="2026-02-02 13:16:54.938941068 +0000 UTC m=+955.241455457" watchObservedRunningTime="2026-02-02 13:16:54.941791984 +0000 UTC m=+955.244306383" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.612027 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl"] Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.613680 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.620912 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-mmg2n"] Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.621971 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-mmg2n" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.622410 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-drlbp" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.622556 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.641418 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl"] Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.641480 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-mmg2n"] Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.686863 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-dlvcq"] Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.687821 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.761556 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv8gh\" (UniqueName: \"kubernetes.io/projected/92d17aed-5894-45b3-8fe9-08b5dfc7c702-kube-api-access-bv8gh\") pod \"nmstate-webhook-8474b5b9d8-j4jzl\" (UID: \"92d17aed-5894-45b3-8fe9-08b5dfc7c702\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.761690 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q9j8\" (UniqueName: \"kubernetes.io/projected/be1a5420-ea1d-40e0-bd09-241151dc6755-kube-api-access-2q9j8\") pod \"nmstate-metrics-54757c584b-mmg2n\" (UID: \"be1a5420-ea1d-40e0-bd09-241151dc6755\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-mmg2n" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.761742 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/92d17aed-5894-45b3-8fe9-08b5dfc7c702-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-j4jzl\" (UID: \"92d17aed-5894-45b3-8fe9-08b5dfc7c702\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.807142 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms"] Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.808548 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.811310 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-grgnx" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.811507 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.811659 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.839834 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms"] Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.864658 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q9j8\" (UniqueName: \"kubernetes.io/projected/be1a5420-ea1d-40e0-bd09-241151dc6755-kube-api-access-2q9j8\") pod \"nmstate-metrics-54757c584b-mmg2n\" (UID: \"be1a5420-ea1d-40e0-bd09-241151dc6755\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-mmg2n" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.864728 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b15ef257-c4ff-4fd9-a04c-a92d38e51b18-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-2f9ms\" (UID: \"b15ef257-c4ff-4fd9-a04c-a92d38e51b18\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.864752 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/92d17aed-5894-45b3-8fe9-08b5dfc7c702-tls-key-pair\") pod 
\"nmstate-webhook-8474b5b9d8-j4jzl\" (UID: \"92d17aed-5894-45b3-8fe9-08b5dfc7c702\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.864832 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g77cp\" (UniqueName: \"kubernetes.io/projected/b15ef257-c4ff-4fd9-a04c-a92d38e51b18-kube-api-access-g77cp\") pod \"nmstate-console-plugin-7754f76f8b-2f9ms\" (UID: \"b15ef257-c4ff-4fd9-a04c-a92d38e51b18\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.864858 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1cf5f077-bb9b-42de-ab25-70b762c3e2e1-nmstate-lock\") pod \"nmstate-handler-dlvcq\" (UID: \"1cf5f077-bb9b-42de-ab25-70b762c3e2e1\") " pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.864886 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1cf5f077-bb9b-42de-ab25-70b762c3e2e1-ovs-socket\") pod \"nmstate-handler-dlvcq\" (UID: \"1cf5f077-bb9b-42de-ab25-70b762c3e2e1\") " pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.864917 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1cf5f077-bb9b-42de-ab25-70b762c3e2e1-dbus-socket\") pod \"nmstate-handler-dlvcq\" (UID: \"1cf5f077-bb9b-42de-ab25-70b762c3e2e1\") " pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.864964 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b15ef257-c4ff-4fd9-a04c-a92d38e51b18-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-2f9ms\" (UID: \"b15ef257-c4ff-4fd9-a04c-a92d38e51b18\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.865039 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv8gh\" (UniqueName: \"kubernetes.io/projected/92d17aed-5894-45b3-8fe9-08b5dfc7c702-kube-api-access-bv8gh\") pod \"nmstate-webhook-8474b5b9d8-j4jzl\" (UID: \"92d17aed-5894-45b3-8fe9-08b5dfc7c702\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.865178 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glqnp\" (UniqueName: \"kubernetes.io/projected/1cf5f077-bb9b-42de-ab25-70b762c3e2e1-kube-api-access-glqnp\") pod \"nmstate-handler-dlvcq\" (UID: \"1cf5f077-bb9b-42de-ab25-70b762c3e2e1\") " pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:02 crc kubenswrapper[4721]: E0202 13:17:02.865348 4721 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 02 13:17:02 crc kubenswrapper[4721]: E0202 13:17:02.865420 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92d17aed-5894-45b3-8fe9-08b5dfc7c702-tls-key-pair podName:92d17aed-5894-45b3-8fe9-08b5dfc7c702 nodeName:}" failed. 
No retries permitted until 2026-02-02 13:17:03.365384136 +0000 UTC m=+963.667898525 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/92d17aed-5894-45b3-8fe9-08b5dfc7c702-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-j4jzl" (UID: "92d17aed-5894-45b3-8fe9-08b5dfc7c702") : secret "openshift-nmstate-webhook" not found Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.894803 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv8gh\" (UniqueName: \"kubernetes.io/projected/92d17aed-5894-45b3-8fe9-08b5dfc7c702-kube-api-access-bv8gh\") pod \"nmstate-webhook-8474b5b9d8-j4jzl\" (UID: \"92d17aed-5894-45b3-8fe9-08b5dfc7c702\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.894828 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q9j8\" (UniqueName: \"kubernetes.io/projected/be1a5420-ea1d-40e0-bd09-241151dc6755-kube-api-access-2q9j8\") pod \"nmstate-metrics-54757c584b-mmg2n\" (UID: \"be1a5420-ea1d-40e0-bd09-241151dc6755\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-mmg2n" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.969093 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1cf5f077-bb9b-42de-ab25-70b762c3e2e1-ovs-socket\") pod \"nmstate-handler-dlvcq\" (UID: \"1cf5f077-bb9b-42de-ab25-70b762c3e2e1\") " pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.969138 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1cf5f077-bb9b-42de-ab25-70b762c3e2e1-dbus-socket\") pod \"nmstate-handler-dlvcq\" (UID: \"1cf5f077-bb9b-42de-ab25-70b762c3e2e1\") " pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.969168 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b15ef257-c4ff-4fd9-a04c-a92d38e51b18-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-2f9ms\" (UID: \"b15ef257-c4ff-4fd9-a04c-a92d38e51b18\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.969240 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glqnp\" (UniqueName: \"kubernetes.io/projected/1cf5f077-bb9b-42de-ab25-70b762c3e2e1-kube-api-access-glqnp\") pod \"nmstate-handler-dlvcq\" (UID: \"1cf5f077-bb9b-42de-ab25-70b762c3e2e1\") " pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.969284 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b15ef257-c4ff-4fd9-a04c-a92d38e51b18-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-2f9ms\" (UID: \"b15ef257-c4ff-4fd9-a04c-a92d38e51b18\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.969318 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g77cp\" (UniqueName: \"kubernetes.io/projected/b15ef257-c4ff-4fd9-a04c-a92d38e51b18-kube-api-access-g77cp\") pod \"nmstate-console-plugin-7754f76f8b-2f9ms\" (UID: 
\"b15ef257-c4ff-4fd9-a04c-a92d38e51b18\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.969333 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1cf5f077-bb9b-42de-ab25-70b762c3e2e1-nmstate-lock\") pod \"nmstate-handler-dlvcq\" (UID: \"1cf5f077-bb9b-42de-ab25-70b762c3e2e1\") " pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.969397 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/1cf5f077-bb9b-42de-ab25-70b762c3e2e1-nmstate-lock\") pod \"nmstate-handler-dlvcq\" (UID: \"1cf5f077-bb9b-42de-ab25-70b762c3e2e1\") " pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.969434 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/1cf5f077-bb9b-42de-ab25-70b762c3e2e1-ovs-socket\") pod \"nmstate-handler-dlvcq\" (UID: \"1cf5f077-bb9b-42de-ab25-70b762c3e2e1\") " pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.969642 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/1cf5f077-bb9b-42de-ab25-70b762c3e2e1-dbus-socket\") pod \"nmstate-handler-dlvcq\" (UID: \"1cf5f077-bb9b-42de-ab25-70b762c3e2e1\") " pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.972353 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b15ef257-c4ff-4fd9-a04c-a92d38e51b18-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-2f9ms\" (UID: \"b15ef257-c4ff-4fd9-a04c-a92d38e51b18\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.988247 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glqnp\" (UniqueName: \"kubernetes.io/projected/1cf5f077-bb9b-42de-ab25-70b762c3e2e1-kube-api-access-glqnp\") pod \"nmstate-handler-dlvcq\" (UID: \"1cf5f077-bb9b-42de-ab25-70b762c3e2e1\") " pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.992965 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b15ef257-c4ff-4fd9-a04c-a92d38e51b18-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-2f9ms\" (UID: \"b15ef257-c4ff-4fd9-a04c-a92d38e51b18\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.996991 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g77cp\" (UniqueName: \"kubernetes.io/projected/b15ef257-c4ff-4fd9-a04c-a92d38e51b18-kube-api-access-g77cp\") pod \"nmstate-console-plugin-7754f76f8b-2f9ms\" (UID: \"b15ef257-c4ff-4fd9-a04c-a92d38e51b18\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms" Feb 02 13:17:02 crc kubenswrapper[4721]: I0202 13:17:02.999024 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-mmg2n" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.027263 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.086485 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-679d56c757-8hcnt"] Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.088088 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.105243 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-679d56c757-8hcnt"] Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.141497 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.279361 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-console-config\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.279765 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-trusted-ca-bundle\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.279803 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xhnw\" (UniqueName: \"kubernetes.io/projected/1e715356-9848-439f-a13d-eb00f34521ec-kube-api-access-7xhnw\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.279836 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e715356-9848-439f-a13d-eb00f34521ec-console-serving-cert\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.279942 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-oauth-serving-cert\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.279984 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-service-ca\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.280015 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/1e715356-9848-439f-a13d-eb00f34521ec-console-oauth-config\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.381566 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-trusted-ca-bundle\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.381621 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xhnw\" (UniqueName: \"kubernetes.io/projected/1e715356-9848-439f-a13d-eb00f34521ec-kube-api-access-7xhnw\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.381654 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e715356-9848-439f-a13d-eb00f34521ec-console-serving-cert\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.381733 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-oauth-serving-cert\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.381767 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/92d17aed-5894-45b3-8fe9-08b5dfc7c702-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-j4jzl\" (UID: \"92d17aed-5894-45b3-8fe9-08b5dfc7c702\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.381797 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-service-ca\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.381827 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e715356-9848-439f-a13d-eb00f34521ec-console-oauth-config\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.381872 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-console-config\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.383003 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-console-config\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.383944 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-trusted-ca-bundle\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.384967 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-service-ca\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.393936 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-oauth-serving-cert\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.416719 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/92d17aed-5894-45b3-8fe9-08b5dfc7c702-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-j4jzl\" (UID: \"92d17aed-5894-45b3-8fe9-08b5dfc7c702\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.416719 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e715356-9848-439f-a13d-eb00f34521ec-console-serving-cert\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.416925 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xhnw\" (UniqueName: \"kubernetes.io/projected/1e715356-9848-439f-a13d-eb00f34521ec-kube-api-access-7xhnw\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.428912 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e715356-9848-439f-a13d-eb00f34521ec-console-oauth-config\") pod \"console-679d56c757-8hcnt\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.430049 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.588186 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl" Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.612477 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-mmg2n"] Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.775276 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms"] Feb 02 13:17:03 crc kubenswrapper[4721]: I0202 13:17:03.902544 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-679d56c757-8hcnt"] Feb 02 13:17:04 crc kubenswrapper[4721]: I0202 13:17:04.008218 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dlvcq" event={"ID":"1cf5f077-bb9b-42de-ab25-70b762c3e2e1","Type":"ContainerStarted","Data":"e6d62e3bcaad0869efdc8f536858a527a68a1600b89c91357fcafe050c0e7c1c"} Feb 02 13:17:04 crc kubenswrapper[4721]: I0202 13:17:04.010293 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-679d56c757-8hcnt" event={"ID":"1e715356-9848-439f-a13d-eb00f34521ec","Type":"ContainerStarted","Data":"d99eec54d3234b8ee9ca1f4e6b988bce26f945deba80a7904e833116e7ebcfe1"} Feb 02 13:17:04 crc kubenswrapper[4721]: I0202 13:17:04.012037 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms" event={"ID":"b15ef257-c4ff-4fd9-a04c-a92d38e51b18","Type":"ContainerStarted","Data":"f5df7a958ca3a57c3410e81afbd829f4ca8d15fe78fdbc6bed875f29e9b9702b"} Feb 02 13:17:04 crc kubenswrapper[4721]: I0202 13:17:04.012827 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-mmg2n" event={"ID":"be1a5420-ea1d-40e0-bd09-241151dc6755","Type":"ContainerStarted","Data":"5a1ef20cfb826a1d5ddd63f0a19e9f8d9cdd66f781f8f1099ecc33c1377f0be1"} Feb 02 13:17:04 crc kubenswrapper[4721]: I0202 13:17:04.088827 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl"] Feb 02 13:17:04 crc kubenswrapper[4721]: W0202 13:17:04.096608 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d17aed_5894_45b3_8fe9_08b5dfc7c702.slice/crio-81d4f57384025303abfca59d4299f8c8ec61cd16930ed9f758531ad503fb5fb2 WatchSource:0}: Error finding container 81d4f57384025303abfca59d4299f8c8ec61cd16930ed9f758531ad503fb5fb2: Status 404 returned error can't find the container with id 81d4f57384025303abfca59d4299f8c8ec61cd16930ed9f758531ad503fb5fb2 Feb 02 13:17:05 crc kubenswrapper[4721]: I0202 13:17:05.038304 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-679d56c757-8hcnt" event={"ID":"1e715356-9848-439f-a13d-eb00f34521ec","Type":"ContainerStarted","Data":"2105e396598b1fd13640d6e576494052ae1ae901f42a2b1dd0e7f495d1eec506"} Feb 02 13:17:05 crc kubenswrapper[4721]: I0202 13:17:05.043908 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl" event={"ID":"92d17aed-5894-45b3-8fe9-08b5dfc7c702","Type":"ContainerStarted","Data":"81d4f57384025303abfca59d4299f8c8ec61cd16930ed9f758531ad503fb5fb2"} Feb 02 13:17:05 crc kubenswrapper[4721]: I0202 13:17:05.061304 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-679d56c757-8hcnt" podStartSLOduration=2.061288791 podStartE2EDuration="2.061288791s" podCreationTimestamp="2026-02-02 
13:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:17:05.054932942 +0000 UTC m=+965.357447331" watchObservedRunningTime="2026-02-02 13:17:05.061288791 +0000 UTC m=+965.363803180" Feb 02 13:17:07 crc kubenswrapper[4721]: I0202 13:17:07.067104 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-mmg2n" event={"ID":"be1a5420-ea1d-40e0-bd09-241151dc6755","Type":"ContainerStarted","Data":"e76db2fc53c20700f0800b081757abc8a1bff0a9ba54d9e7e3778d9cdd35190d"} Feb 02 13:17:07 crc kubenswrapper[4721]: I0202 13:17:07.069417 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dlvcq" event={"ID":"1cf5f077-bb9b-42de-ab25-70b762c3e2e1","Type":"ContainerStarted","Data":"88b7c0f5e7f7001f819fe2a158afa2cb6a24a9c28b6b363ff8d409e4b5d9b1d4"} Feb 02 13:17:07 crc kubenswrapper[4721]: I0202 13:17:07.069587 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:07 crc kubenswrapper[4721]: I0202 13:17:07.071814 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl" event={"ID":"92d17aed-5894-45b3-8fe9-08b5dfc7c702","Type":"ContainerStarted","Data":"1a9b0f637a6b22e8f8a09de1c4b8732bc72e65bdab10148e4aa573d3a43f94b1"} Feb 02 13:17:07 crc kubenswrapper[4721]: I0202 13:17:07.072016 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl" Feb 02 13:17:07 crc kubenswrapper[4721]: I0202 13:17:07.074579 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms" event={"ID":"b15ef257-c4ff-4fd9-a04c-a92d38e51b18","Type":"ContainerStarted","Data":"1d6e2225cab3e5a250021885a92cb8e5bcf5e204e570e588a2e471239e0a7a52"} Feb 02 13:17:07 crc kubenswrapper[4721]: I0202 13:17:07.093077 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-dlvcq" podStartSLOduration=1.638088556 podStartE2EDuration="5.093042736s" podCreationTimestamp="2026-02-02 13:17:02 +0000 UTC" firstStartedPulling="2026-02-02 13:17:03.075902036 +0000 UTC m=+963.378416425" lastFinishedPulling="2026-02-02 13:17:06.530856186 +0000 UTC m=+966.833370605" observedRunningTime="2026-02-02 13:17:07.089171342 +0000 UTC m=+967.391685731" watchObservedRunningTime="2026-02-02 13:17:07.093042736 +0000 UTC m=+967.395557125" Feb 02 13:17:07 crc kubenswrapper[4721]: I0202 13:17:07.106487 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2f9ms" podStartSLOduration=2.39822235 podStartE2EDuration="5.106466563s" podCreationTimestamp="2026-02-02 13:17:02 +0000 UTC" firstStartedPulling="2026-02-02 13:17:03.790569287 +0000 UTC m=+964.093083676" lastFinishedPulling="2026-02-02 13:17:06.4988135 +0000 UTC m=+966.801327889" observedRunningTime="2026-02-02 13:17:07.102444636 +0000 UTC m=+967.404959035" watchObservedRunningTime="2026-02-02 13:17:07.106466563 +0000 UTC m=+967.408980952" Feb 02 13:17:07 crc kubenswrapper[4721]: I0202 13:17:07.139552 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl" podStartSLOduration=2.7089788759999998 podStartE2EDuration="5.139533046s" podCreationTimestamp="2026-02-02 13:17:02 +0000 UTC" 
firstStartedPulling="2026-02-02 13:17:04.099031602 +0000 UTC m=+964.401545991" lastFinishedPulling="2026-02-02 13:17:06.529585752 +0000 UTC m=+966.832100161" observedRunningTime="2026-02-02 13:17:07.133640719 +0000 UTC m=+967.436155108" watchObservedRunningTime="2026-02-02 13:17:07.139533046 +0000 UTC m=+967.442047435" Feb 02 13:17:10 crc kubenswrapper[4721]: I0202 13:17:10.102462 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-mmg2n" event={"ID":"be1a5420-ea1d-40e0-bd09-241151dc6755","Type":"ContainerStarted","Data":"755586293cc8fe6b5dfd88bce2edec9ea124a377de4b078172f8d3e7eae575b8"} Feb 02 13:17:10 crc kubenswrapper[4721]: I0202 13:17:10.128651 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-mmg2n" podStartSLOduration=2.194615374 podStartE2EDuration="8.128625059s" podCreationTimestamp="2026-02-02 13:17:02 +0000 UTC" firstStartedPulling="2026-02-02 13:17:03.633690138 +0000 UTC m=+963.936204527" lastFinishedPulling="2026-02-02 13:17:09.567699823 +0000 UTC m=+969.870214212" observedRunningTime="2026-02-02 13:17:10.120840461 +0000 UTC m=+970.423354850" watchObservedRunningTime="2026-02-02 13:17:10.128625059 +0000 UTC m=+970.431139458" Feb 02 13:17:13 crc kubenswrapper[4721]: I0202 13:17:13.054228 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-dlvcq" Feb 02 13:17:13 crc kubenswrapper[4721]: I0202 13:17:13.432027 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:13 crc kubenswrapper[4721]: I0202 13:17:13.432300 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:13 crc kubenswrapper[4721]: I0202 13:17:13.436621 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:14 crc kubenswrapper[4721]: I0202 13:17:14.131035 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:17:14 crc kubenswrapper[4721]: I0202 13:17:14.187704 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-858d4f646b-v8xpv"] Feb 02 13:17:23 crc kubenswrapper[4721]: I0202 13:17:23.594404 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-j4jzl" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.238508 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-858d4f646b-v8xpv" podUID="5cb1d5e0-e67a-459b-ad6a-794d2f8bab70" containerName="console" containerID="cri-o://dd8792e47fe91e154df2c809101cbf81ddf00e86f1e989f7de7b429baf8fae66" gracePeriod=15 Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.702251 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-858d4f646b-v8xpv_5cb1d5e0-e67a-459b-ad6a-794d2f8bab70/console/0.log" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.702616 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.795168 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-config\") pod \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.795259 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-serving-cert\") pod \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.795373 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-oauth-serving-cert\") pod \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.795409 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-658px\" (UniqueName: \"kubernetes.io/projected/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-kube-api-access-658px\") pod \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.795467 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-oauth-config\") pod \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.795549 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-trusted-ca-bundle\") pod \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.795577 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-service-ca\") pod \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\" (UID: \"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70\") " Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.795850 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5cb1d5e0-e67a-459b-ad6a-794d2f8bab70" (UID: "5cb1d5e0-e67a-459b-ad6a-794d2f8bab70"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.795844 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-config" (OuterVolumeSpecName: "console-config") pod "5cb1d5e0-e67a-459b-ad6a-794d2f8bab70" (UID: "5cb1d5e0-e67a-459b-ad6a-794d2f8bab70"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.796212 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5cb1d5e0-e67a-459b-ad6a-794d2f8bab70" (UID: "5cb1d5e0-e67a-459b-ad6a-794d2f8bab70"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.796208 4721 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.796265 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-service-ca" (OuterVolumeSpecName: "service-ca") pod "5cb1d5e0-e67a-459b-ad6a-794d2f8bab70" (UID: "5cb1d5e0-e67a-459b-ad6a-794d2f8bab70"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.801334 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5cb1d5e0-e67a-459b-ad6a-794d2f8bab70" (UID: "5cb1d5e0-e67a-459b-ad6a-794d2f8bab70"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.804562 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5cb1d5e0-e67a-459b-ad6a-794d2f8bab70" (UID: "5cb1d5e0-e67a-459b-ad6a-794d2f8bab70"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.805123 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-kube-api-access-658px" (OuterVolumeSpecName: "kube-api-access-658px") pod "5cb1d5e0-e67a-459b-ad6a-794d2f8bab70" (UID: "5cb1d5e0-e67a-459b-ad6a-794d2f8bab70"). InnerVolumeSpecName "kube-api-access-658px". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.897468 4721 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.897506 4721 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.897516 4721 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.897526 4721 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.897534 4721 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:17:39 crc kubenswrapper[4721]: I0202 13:17:39.897544 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-658px\" (UniqueName: \"kubernetes.io/projected/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70-kube-api-access-658px\") on node \"crc\" DevicePath \"\"" Feb 02 13:17:40 crc kubenswrapper[4721]: I0202 13:17:40.318175 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-858d4f646b-v8xpv_5cb1d5e0-e67a-459b-ad6a-794d2f8bab70/console/0.log" Feb 02 13:17:40 crc kubenswrapper[4721]: I0202 13:17:40.318223 4721 generic.go:334] "Generic (PLEG): container finished" podID="5cb1d5e0-e67a-459b-ad6a-794d2f8bab70" containerID="dd8792e47fe91e154df2c809101cbf81ddf00e86f1e989f7de7b429baf8fae66" exitCode=2 Feb 02 13:17:40 crc kubenswrapper[4721]: I0202 13:17:40.318253 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-858d4f646b-v8xpv" event={"ID":"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70","Type":"ContainerDied","Data":"dd8792e47fe91e154df2c809101cbf81ddf00e86f1e989f7de7b429baf8fae66"} Feb 02 13:17:40 crc kubenswrapper[4721]: I0202 13:17:40.318279 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-858d4f646b-v8xpv" event={"ID":"5cb1d5e0-e67a-459b-ad6a-794d2f8bab70","Type":"ContainerDied","Data":"550ae8a7d8f413e415a65a6ec4a23601971fd9d7d7d542219e95480b30d156d6"} Feb 02 13:17:40 crc kubenswrapper[4721]: I0202 13:17:40.318295 4721 scope.go:117] "RemoveContainer" containerID="dd8792e47fe91e154df2c809101cbf81ddf00e86f1e989f7de7b429baf8fae66" Feb 02 13:17:40 crc kubenswrapper[4721]: I0202 13:17:40.318305 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-858d4f646b-v8xpv" Feb 02 13:17:40 crc kubenswrapper[4721]: I0202 13:17:40.344163 4721 scope.go:117] "RemoveContainer" containerID="dd8792e47fe91e154df2c809101cbf81ddf00e86f1e989f7de7b429baf8fae66" Feb 02 13:17:40 crc kubenswrapper[4721]: E0202 13:17:40.346258 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd8792e47fe91e154df2c809101cbf81ddf00e86f1e989f7de7b429baf8fae66\": container with ID starting with dd8792e47fe91e154df2c809101cbf81ddf00e86f1e989f7de7b429baf8fae66 not found: ID does not exist" containerID="dd8792e47fe91e154df2c809101cbf81ddf00e86f1e989f7de7b429baf8fae66" Feb 02 13:17:40 crc kubenswrapper[4721]: I0202 13:17:40.346293 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd8792e47fe91e154df2c809101cbf81ddf00e86f1e989f7de7b429baf8fae66"} err="failed to get container status \"dd8792e47fe91e154df2c809101cbf81ddf00e86f1e989f7de7b429baf8fae66\": rpc error: code = NotFound desc = could not find container \"dd8792e47fe91e154df2c809101cbf81ddf00e86f1e989f7de7b429baf8fae66\": container with ID starting with dd8792e47fe91e154df2c809101cbf81ddf00e86f1e989f7de7b429baf8fae66 not found: ID does not exist" Feb 02 13:17:40 crc kubenswrapper[4721]: I0202 13:17:40.355530 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-858d4f646b-v8xpv"] Feb 02 13:17:40 crc kubenswrapper[4721]: I0202 13:17:40.365997 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-858d4f646b-v8xpv"] Feb 02 13:17:40 crc kubenswrapper[4721]: I0202 13:17:40.419507 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cb1d5e0-e67a-459b-ad6a-794d2f8bab70" path="/var/lib/kubelet/pods/5cb1d5e0-e67a-459b-ad6a-794d2f8bab70/volumes" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.051931 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg"] Feb 02 13:17:41 crc kubenswrapper[4721]: E0202 13:17:41.052284 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cb1d5e0-e67a-459b-ad6a-794d2f8bab70" containerName="console" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.052300 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cb1d5e0-e67a-459b-ad6a-794d2f8bab70" containerName="console" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.052436 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cb1d5e0-e67a-459b-ad6a-794d2f8bab70" containerName="console" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.053642 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.056335 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.067309 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg"] Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.219053 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtl8v\" (UniqueName: \"kubernetes.io/projected/6f007b81-04cd-4913-ad24-51aa6c5b60c8-kube-api-access-gtl8v\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg\" (UID: \"6f007b81-04cd-4913-ad24-51aa6c5b60c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.219424 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f007b81-04cd-4913-ad24-51aa6c5b60c8-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg\" (UID: \"6f007b81-04cd-4913-ad24-51aa6c5b60c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.219546 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f007b81-04cd-4913-ad24-51aa6c5b60c8-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg\" (UID: \"6f007b81-04cd-4913-ad24-51aa6c5b60c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.320741 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtl8v\" (UniqueName: \"kubernetes.io/projected/6f007b81-04cd-4913-ad24-51aa6c5b60c8-kube-api-access-gtl8v\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg\" (UID: \"6f007b81-04cd-4913-ad24-51aa6c5b60c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.320818 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f007b81-04cd-4913-ad24-51aa6c5b60c8-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg\" (UID: \"6f007b81-04cd-4913-ad24-51aa6c5b60c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.320852 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f007b81-04cd-4913-ad24-51aa6c5b60c8-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg\" (UID: \"6f007b81-04cd-4913-ad24-51aa6c5b60c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.321415 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/6f007b81-04cd-4913-ad24-51aa6c5b60c8-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg\" (UID: \"6f007b81-04cd-4913-ad24-51aa6c5b60c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.321466 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f007b81-04cd-4913-ad24-51aa6c5b60c8-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg\" (UID: \"6f007b81-04cd-4913-ad24-51aa6c5b60c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.340543 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtl8v\" (UniqueName: \"kubernetes.io/projected/6f007b81-04cd-4913-ad24-51aa6c5b60c8-kube-api-access-gtl8v\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg\" (UID: \"6f007b81-04cd-4913-ad24-51aa6c5b60c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.392891 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" Feb 02 13:17:41 crc kubenswrapper[4721]: I0202 13:17:41.814319 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg"] Feb 02 13:17:42 crc kubenswrapper[4721]: I0202 13:17:42.338668 4721 generic.go:334] "Generic (PLEG): container finished" podID="6f007b81-04cd-4913-ad24-51aa6c5b60c8" containerID="28f91195297a776d46f7db103d4985273de948baa29cc9a5e82f3cc4d87a54e0" exitCode=0 Feb 02 13:17:42 crc kubenswrapper[4721]: I0202 13:17:42.338849 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" event={"ID":"6f007b81-04cd-4913-ad24-51aa6c5b60c8","Type":"ContainerDied","Data":"28f91195297a776d46f7db103d4985273de948baa29cc9a5e82f3cc4d87a54e0"} Feb 02 13:17:42 crc kubenswrapper[4721]: I0202 13:17:42.339789 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" event={"ID":"6f007b81-04cd-4913-ad24-51aa6c5b60c8","Type":"ContainerStarted","Data":"fcc3c9a977e019475fc2f0060552e4b108760efb6fd036a923df095189c99a45"} Feb 02 13:17:42 crc kubenswrapper[4721]: I0202 13:17:42.340415 4721 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 13:17:44 crc kubenswrapper[4721]: I0202 13:17:44.354461 4721 generic.go:334] "Generic (PLEG): container finished" podID="6f007b81-04cd-4913-ad24-51aa6c5b60c8" containerID="83ca642d436fbfde4d4a1af0b2ffeac20b596005b9eeaab9d33b871989acea35" exitCode=0 Feb 02 13:17:44 crc kubenswrapper[4721]: I0202 13:17:44.354551 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" event={"ID":"6f007b81-04cd-4913-ad24-51aa6c5b60c8","Type":"ContainerDied","Data":"83ca642d436fbfde4d4a1af0b2ffeac20b596005b9eeaab9d33b871989acea35"} Feb 02 13:17:44 crc kubenswrapper[4721]: I0202 13:17:44.763747 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:17:44 crc kubenswrapper[4721]: I0202 13:17:44.763810 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:17:45 crc kubenswrapper[4721]: I0202 13:17:45.366012 4721 generic.go:334] "Generic (PLEG): container finished" podID="6f007b81-04cd-4913-ad24-51aa6c5b60c8" containerID="05c7b1d8da5ec1121b2db7edd10eb8b8157a4017a8a0c59b7beee7bbcbde7f2e" exitCode=0 Feb 02 13:17:45 crc kubenswrapper[4721]: I0202 13:17:45.366117 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" event={"ID":"6f007b81-04cd-4913-ad24-51aa6c5b60c8","Type":"ContainerDied","Data":"05c7b1d8da5ec1121b2db7edd10eb8b8157a4017a8a0c59b7beee7bbcbde7f2e"} Feb 02 13:17:46 crc kubenswrapper[4721]: I0202 13:17:46.670785 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" Feb 02 13:17:46 crc kubenswrapper[4721]: I0202 13:17:46.813125 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f007b81-04cd-4913-ad24-51aa6c5b60c8-bundle\") pod \"6f007b81-04cd-4913-ad24-51aa6c5b60c8\" (UID: \"6f007b81-04cd-4913-ad24-51aa6c5b60c8\") " Feb 02 13:17:46 crc kubenswrapper[4721]: I0202 13:17:46.813179 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtl8v\" (UniqueName: \"kubernetes.io/projected/6f007b81-04cd-4913-ad24-51aa6c5b60c8-kube-api-access-gtl8v\") pod \"6f007b81-04cd-4913-ad24-51aa6c5b60c8\" (UID: \"6f007b81-04cd-4913-ad24-51aa6c5b60c8\") " Feb 02 13:17:46 crc kubenswrapper[4721]: I0202 13:17:46.813261 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f007b81-04cd-4913-ad24-51aa6c5b60c8-util\") pod \"6f007b81-04cd-4913-ad24-51aa6c5b60c8\" (UID: \"6f007b81-04cd-4913-ad24-51aa6c5b60c8\") " Feb 02 13:17:46 crc kubenswrapper[4721]: I0202 13:17:46.814777 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f007b81-04cd-4913-ad24-51aa6c5b60c8-bundle" (OuterVolumeSpecName: "bundle") pod "6f007b81-04cd-4913-ad24-51aa6c5b60c8" (UID: "6f007b81-04cd-4913-ad24-51aa6c5b60c8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:17:46 crc kubenswrapper[4721]: I0202 13:17:46.819329 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f007b81-04cd-4913-ad24-51aa6c5b60c8-kube-api-access-gtl8v" (OuterVolumeSpecName: "kube-api-access-gtl8v") pod "6f007b81-04cd-4913-ad24-51aa6c5b60c8" (UID: "6f007b81-04cd-4913-ad24-51aa6c5b60c8"). InnerVolumeSpecName "kube-api-access-gtl8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:17:46 crc kubenswrapper[4721]: I0202 13:17:46.829341 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f007b81-04cd-4913-ad24-51aa6c5b60c8-util" (OuterVolumeSpecName: "util") pod "6f007b81-04cd-4913-ad24-51aa6c5b60c8" (UID: "6f007b81-04cd-4913-ad24-51aa6c5b60c8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:17:46 crc kubenswrapper[4721]: I0202 13:17:46.915381 4721 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f007b81-04cd-4913-ad24-51aa6c5b60c8-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:17:46 crc kubenswrapper[4721]: I0202 13:17:46.915469 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtl8v\" (UniqueName: \"kubernetes.io/projected/6f007b81-04cd-4913-ad24-51aa6c5b60c8-kube-api-access-gtl8v\") on node \"crc\" DevicePath \"\"" Feb 02 13:17:46 crc kubenswrapper[4721]: I0202 13:17:46.915532 4721 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f007b81-04cd-4913-ad24-51aa6c5b60c8-util\") on node \"crc\" DevicePath \"\"" Feb 02 13:17:47 crc kubenswrapper[4721]: I0202 13:17:47.380129 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" event={"ID":"6f007b81-04cd-4913-ad24-51aa6c5b60c8","Type":"ContainerDied","Data":"fcc3c9a977e019475fc2f0060552e4b108760efb6fd036a923df095189c99a45"} Feb 02 13:17:47 crc kubenswrapper[4721]: I0202 13:17:47.380165 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcc3c9a977e019475fc2f0060552e4b108760efb6fd036a923df095189c99a45" Feb 02 13:17:47 crc kubenswrapper[4721]: I0202 13:17:47.380226 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.449864 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz"] Feb 02 13:17:56 crc kubenswrapper[4721]: E0202 13:17:56.450769 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f007b81-04cd-4913-ad24-51aa6c5b60c8" containerName="util" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.450785 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f007b81-04cd-4913-ad24-51aa6c5b60c8" containerName="util" Feb 02 13:17:56 crc kubenswrapper[4721]: E0202 13:17:56.450805 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f007b81-04cd-4913-ad24-51aa6c5b60c8" containerName="extract" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.450812 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f007b81-04cd-4913-ad24-51aa6c5b60c8" containerName="extract" Feb 02 13:17:56 crc kubenswrapper[4721]: E0202 13:17:56.450832 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f007b81-04cd-4913-ad24-51aa6c5b60c8" containerName="pull" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.450841 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f007b81-04cd-4913-ad24-51aa6c5b60c8" containerName="pull" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.451033 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f007b81-04cd-4913-ad24-51aa6c5b60c8" containerName="extract" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.451723 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.453722 4721 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-dhg9g" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.454350 4721 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.454690 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.454965 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.469512 4721 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.476726 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz"] Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.569095 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c6e741b-2539-4be0-898c-5fee37f67d21-webhook-cert\") pod \"metallb-operator-controller-manager-67895b6557-xpzcz\" (UID: \"4c6e741b-2539-4be0-898c-5fee37f67d21\") " pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.569368 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-kjmbx\" (UniqueName: \"kubernetes.io/projected/4c6e741b-2539-4be0-898c-5fee37f67d21-kube-api-access-kjmbx\") pod \"metallb-operator-controller-manager-67895b6557-xpzcz\" (UID: \"4c6e741b-2539-4be0-898c-5fee37f67d21\") " pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.569696 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c6e741b-2539-4be0-898c-5fee37f67d21-apiservice-cert\") pod \"metallb-operator-controller-manager-67895b6557-xpzcz\" (UID: \"4c6e741b-2539-4be0-898c-5fee37f67d21\") " pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.671153 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c6e741b-2539-4be0-898c-5fee37f67d21-apiservice-cert\") pod \"metallb-operator-controller-manager-67895b6557-xpzcz\" (UID: \"4c6e741b-2539-4be0-898c-5fee37f67d21\") " pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.671218 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c6e741b-2539-4be0-898c-5fee37f67d21-webhook-cert\") pod \"metallb-operator-controller-manager-67895b6557-xpzcz\" (UID: \"4c6e741b-2539-4be0-898c-5fee37f67d21\") " pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.671264 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjmbx\" (UniqueName: \"kubernetes.io/projected/4c6e741b-2539-4be0-898c-5fee37f67d21-kube-api-access-kjmbx\") pod \"metallb-operator-controller-manager-67895b6557-xpzcz\" (UID: \"4c6e741b-2539-4be0-898c-5fee37f67d21\") " pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.682802 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd"] Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.683936 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.685908 4721 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.686046 4721 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.686193 4721 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-d48t5" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.688342 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4c6e741b-2539-4be0-898c-5fee37f67d21-apiservice-cert\") pod \"metallb-operator-controller-manager-67895b6557-xpzcz\" (UID: \"4c6e741b-2539-4be0-898c-5fee37f67d21\") " pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.689339 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjmbx\" (UniqueName: \"kubernetes.io/projected/4c6e741b-2539-4be0-898c-5fee37f67d21-kube-api-access-kjmbx\") pod \"metallb-operator-controller-manager-67895b6557-xpzcz\" (UID: \"4c6e741b-2539-4be0-898c-5fee37f67d21\") " pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.689887 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4c6e741b-2539-4be0-898c-5fee37f67d21-webhook-cert\") pod \"metallb-operator-controller-manager-67895b6557-xpzcz\" (UID: \"4c6e741b-2539-4be0-898c-5fee37f67d21\") " pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.702578 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd"] Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.770604 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.772629 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10a7b124-f250-42d3-9e7c-af29d7204edb-webhook-cert\") pod \"metallb-operator-webhook-server-bb6bc86c6-l2cpd\" (UID: \"10a7b124-f250-42d3-9e7c-af29d7204edb\") " pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.772664 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10a7b124-f250-42d3-9e7c-af29d7204edb-apiservice-cert\") pod \"metallb-operator-webhook-server-bb6bc86c6-l2cpd\" (UID: \"10a7b124-f250-42d3-9e7c-af29d7204edb\") " pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.772764 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dnfp\" (UniqueName: \"kubernetes.io/projected/10a7b124-f250-42d3-9e7c-af29d7204edb-kube-api-access-4dnfp\") pod \"metallb-operator-webhook-server-bb6bc86c6-l2cpd\" (UID: \"10a7b124-f250-42d3-9e7c-af29d7204edb\") " pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.874855 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dnfp\" (UniqueName: \"kubernetes.io/projected/10a7b124-f250-42d3-9e7c-af29d7204edb-kube-api-access-4dnfp\") pod \"metallb-operator-webhook-server-bb6bc86c6-l2cpd\" (UID: \"10a7b124-f250-42d3-9e7c-af29d7204edb\") " pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.875306 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10a7b124-f250-42d3-9e7c-af29d7204edb-webhook-cert\") pod \"metallb-operator-webhook-server-bb6bc86c6-l2cpd\" (UID: \"10a7b124-f250-42d3-9e7c-af29d7204edb\") " pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.875345 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10a7b124-f250-42d3-9e7c-af29d7204edb-apiservice-cert\") pod \"metallb-operator-webhook-server-bb6bc86c6-l2cpd\" (UID: \"10a7b124-f250-42d3-9e7c-af29d7204edb\") " pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.880814 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10a7b124-f250-42d3-9e7c-af29d7204edb-apiservice-cert\") pod \"metallb-operator-webhook-server-bb6bc86c6-l2cpd\" (UID: \"10a7b124-f250-42d3-9e7c-af29d7204edb\") " pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.883785 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10a7b124-f250-42d3-9e7c-af29d7204edb-webhook-cert\") pod \"metallb-operator-webhook-server-bb6bc86c6-l2cpd\" (UID: \"10a7b124-f250-42d3-9e7c-af29d7204edb\") " 
pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" Feb 02 13:17:56 crc kubenswrapper[4721]: I0202 13:17:56.898030 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dnfp\" (UniqueName: \"kubernetes.io/projected/10a7b124-f250-42d3-9e7c-af29d7204edb-kube-api-access-4dnfp\") pod \"metallb-operator-webhook-server-bb6bc86c6-l2cpd\" (UID: \"10a7b124-f250-42d3-9e7c-af29d7204edb\") " pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" Feb 02 13:17:57 crc kubenswrapper[4721]: I0202 13:17:57.072151 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" Feb 02 13:17:57 crc kubenswrapper[4721]: I0202 13:17:57.374267 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz"] Feb 02 13:17:57 crc kubenswrapper[4721]: I0202 13:17:57.448330 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" event={"ID":"4c6e741b-2539-4be0-898c-5fee37f67d21","Type":"ContainerStarted","Data":"a51b6dfaf849b0aa96870de27c5a75504a8a4ffcf6aed84b60bdfcf3507c6156"} Feb 02 13:17:57 crc kubenswrapper[4721]: I0202 13:17:57.458841 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd"] Feb 02 13:17:58 crc kubenswrapper[4721]: I0202 13:17:58.469019 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" event={"ID":"10a7b124-f250-42d3-9e7c-af29d7204edb","Type":"ContainerStarted","Data":"ab0b511b610a9c06890090fa80bc04f289f5e46dd00161f5fce348a49b20787d"} Feb 02 13:18:02 crc kubenswrapper[4721]: I0202 13:18:02.521312 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" event={"ID":"10a7b124-f250-42d3-9e7c-af29d7204edb","Type":"ContainerStarted","Data":"be875bc33c8967c4d3e45833bc59c06ce1a6a4bc3b27d6a16bf8bbc3482ffb1b"} Feb 02 13:18:02 crc kubenswrapper[4721]: I0202 13:18:02.522851 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" Feb 02 13:18:02 crc kubenswrapper[4721]: I0202 13:18:02.524194 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" event={"ID":"4c6e741b-2539-4be0-898c-5fee37f67d21","Type":"ContainerStarted","Data":"c61d42ada705799b12731aca39ecad8f807265698055e33634526ddffcf5831e"} Feb 02 13:18:02 crc kubenswrapper[4721]: I0202 13:18:02.524616 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" Feb 02 13:18:02 crc kubenswrapper[4721]: I0202 13:18:02.547796 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" podStartSLOduration=2.142755597 podStartE2EDuration="6.547775597s" podCreationTimestamp="2026-02-02 13:17:56 +0000 UTC" firstStartedPulling="2026-02-02 13:17:57.465251046 +0000 UTC m=+1017.767765435" lastFinishedPulling="2026-02-02 13:18:01.870271046 +0000 UTC m=+1022.172785435" observedRunningTime="2026-02-02 13:18:02.540923073 +0000 UTC m=+1022.843437462" watchObservedRunningTime="2026-02-02 13:18:02.547775597 +0000 UTC m=+1022.850289986" Feb 02 13:18:02 crc 
kubenswrapper[4721]: I0202 13:18:02.572242 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" podStartSLOduration=2.118255718 podStartE2EDuration="6.572221784s" podCreationTimestamp="2026-02-02 13:17:56 +0000 UTC" firstStartedPulling="2026-02-02 13:17:57.396534718 +0000 UTC m=+1017.699049107" lastFinishedPulling="2026-02-02 13:18:01.850500784 +0000 UTC m=+1022.153015173" observedRunningTime="2026-02-02 13:18:02.562312708 +0000 UTC m=+1022.864827127" watchObservedRunningTime="2026-02-02 13:18:02.572221784 +0000 UTC m=+1022.874736183" Feb 02 13:18:14 crc kubenswrapper[4721]: I0202 13:18:14.764803 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:18:14 crc kubenswrapper[4721]: I0202 13:18:14.765468 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:18:17 crc kubenswrapper[4721]: I0202 13:18:17.078571 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-bb6bc86c6-l2cpd" Feb 02 13:18:36 crc kubenswrapper[4721]: I0202 13:18:36.780990 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-67895b6557-xpzcz" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.436351 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-8ts6n"] Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.439231 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.441301 4721 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.441454 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.441532 4721 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-7k24k" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.451942 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn"] Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.452897 4721 util.go:30] "No sandbox for pod can be found. 
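
Note: machine-config-daemon-rppjz keeps failing its liveness probe on a 30s cadence (13:17:44, 13:18:14, and again at 13:18:44 further down) because nothing answers on 127.0.0.1:8798; after enough consecutive failures the kubelet kills and restarts the container, as the 13:18:44 records later show. A minimal HTTP prober in the same spirit as the prober.go records — a hedged sketch, not prober.go itself:

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probe performs one HTTP liveness check; a transport error such as
    // "connect: connection refused" or a non-2xx status is a failure,
    // matching the probeResult="failure" records above.
    func probe(url string) error {
        client := &http.Client{Timeout: 1 * time.Second}
        resp, err := client.Get(url)
        if err != nil {
            return err
        }
        defer resp.Body.Close()
        if resp.StatusCode < 200 || resp.StatusCode >= 300 {
            return fmt.Errorf("status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        if err := probe("http://127.0.0.1:8798/health"); err != nil {
            fmt.Println("Probe failed:", err)
        }
    }
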
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.454621 4721 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.465052 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn"] Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.498643 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fda33e0-d0a3-4266-aeb1-fc07965d8c35-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-4t8pn\" (UID: \"4fda33e0-d0a3-4266-aeb1-fc07965d8c35\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.498905 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5f685485-23a9-45dd-90cd-62ab47eab713-frr-conf\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.498949 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5f685485-23a9-45dd-90cd-62ab47eab713-frr-sockets\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.498987 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwfqg\" (UniqueName: \"kubernetes.io/projected/4fda33e0-d0a3-4266-aeb1-fc07965d8c35-kube-api-access-mwfqg\") pod \"frr-k8s-webhook-server-7df86c4f6c-4t8pn\" (UID: \"4fda33e0-d0a3-4266-aeb1-fc07965d8c35\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.499044 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5f685485-23a9-45dd-90cd-62ab47eab713-reloader\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.499096 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2wbp\" (UniqueName: \"kubernetes.io/projected/5f685485-23a9-45dd-90cd-62ab47eab713-kube-api-access-n2wbp\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.499120 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5f685485-23a9-45dd-90cd-62ab47eab713-metrics\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.499139 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f685485-23a9-45dd-90cd-62ab47eab713-metrics-certs\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " 
pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.499171 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5f685485-23a9-45dd-90cd-62ab47eab713-frr-startup\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.548189 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-2hhvl"] Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.549693 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-2hhvl" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.551288 4721 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.551356 4721 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.551601 4721 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-kmfnz" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.551636 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.566308 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-rq76j"] Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.567612 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-rq76j" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.571646 4721 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.611619 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5f685485-23a9-45dd-90cd-62ab47eab713-reloader\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.611741 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2wbp\" (UniqueName: \"kubernetes.io/projected/5f685485-23a9-45dd-90cd-62ab47eab713-kube-api-access-n2wbp\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.611781 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5f685485-23a9-45dd-90cd-62ab47eab713-metrics\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.611797 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f685485-23a9-45dd-90cd-62ab47eab713-metrics-certs\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.611855 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" 
(UniqueName: \"kubernetes.io/configmap/5f685485-23a9-45dd-90cd-62ab47eab713-frr-startup\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.611966 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fda33e0-d0a3-4266-aeb1-fc07965d8c35-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-4t8pn\" (UID: \"4fda33e0-d0a3-4266-aeb1-fc07965d8c35\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.611996 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5f685485-23a9-45dd-90cd-62ab47eab713-frr-conf\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.612045 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5f685485-23a9-45dd-90cd-62ab47eab713-frr-sockets\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.612109 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwfqg\" (UniqueName: \"kubernetes.io/projected/4fda33e0-d0a3-4266-aeb1-fc07965d8c35-kube-api-access-mwfqg\") pod \"frr-k8s-webhook-server-7df86c4f6c-4t8pn\" (UID: \"4fda33e0-d0a3-4266-aeb1-fc07965d8c35\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn" Feb 02 13:18:37 crc kubenswrapper[4721]: E0202 13:18:37.612554 4721 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.612566 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5f685485-23a9-45dd-90cd-62ab47eab713-reloader\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: E0202 13:18:37.612618 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fda33e0-d0a3-4266-aeb1-fc07965d8c35-cert podName:4fda33e0-d0a3-4266-aeb1-fc07965d8c35 nodeName:}" failed. No retries permitted until 2026-02-02 13:18:38.112596932 +0000 UTC m=+1058.415111321 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4fda33e0-d0a3-4266-aeb1-fc07965d8c35-cert") pod "frr-k8s-webhook-server-7df86c4f6c-4t8pn" (UID: "4fda33e0-d0a3-4266-aeb1-fc07965d8c35") : secret "frr-k8s-webhook-server-cert" not found Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.612911 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5f685485-23a9-45dd-90cd-62ab47eab713-metrics\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.613133 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5f685485-23a9-45dd-90cd-62ab47eab713-frr-conf\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.613179 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5f685485-23a9-45dd-90cd-62ab47eab713-frr-sockets\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: E0202 13:18:37.613204 4721 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 02 13:18:37 crc kubenswrapper[4721]: E0202 13:18:37.613229 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f685485-23a9-45dd-90cd-62ab47eab713-metrics-certs podName:5f685485-23a9-45dd-90cd-62ab47eab713 nodeName:}" failed. No retries permitted until 2026-02-02 13:18:38.113221938 +0000 UTC m=+1058.415736327 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5f685485-23a9-45dd-90cd-62ab47eab713-metrics-certs") pod "frr-k8s-8ts6n" (UID: "5f685485-23a9-45dd-90cd-62ab47eab713") : secret "frr-k8s-certs-secret" not found Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.616651 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-rq76j"] Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.616912 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5f685485-23a9-45dd-90cd-62ab47eab713-frr-startup\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.635033 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2wbp\" (UniqueName: \"kubernetes.io/projected/5f685485-23a9-45dd-90cd-62ab47eab713-kube-api-access-n2wbp\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.660431 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwfqg\" (UniqueName: \"kubernetes.io/projected/4fda33e0-d0a3-4266-aeb1-fc07965d8c35-kube-api-access-mwfqg\") pod \"frr-k8s-webhook-server-7df86c4f6c-4t8pn\" (UID: \"4fda33e0-d0a3-4266-aeb1-fc07965d8c35\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.713878 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/486fb2e8-15fe-46c1-b62c-89f2b2abf064-memberlist\") pod \"speaker-2hhvl\" (UID: \"486fb2e8-15fe-46c1-b62c-89f2b2abf064\") " pod="metallb-system/speaker-2hhvl" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.713956 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/486fb2e8-15fe-46c1-b62c-89f2b2abf064-metallb-excludel2\") pod \"speaker-2hhvl\" (UID: \"486fb2e8-15fe-46c1-b62c-89f2b2abf064\") " pod="metallb-system/speaker-2hhvl" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.714015 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8fb94c8-b6a7-47c1-bf64-c01350b47983-metrics-certs\") pod \"controller-6968d8fdc4-rq76j\" (UID: \"d8fb94c8-b6a7-47c1-bf64-c01350b47983\") " pod="metallb-system/controller-6968d8fdc4-rq76j" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.714036 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4shm8\" (UniqueName: \"kubernetes.io/projected/486fb2e8-15fe-46c1-b62c-89f2b2abf064-kube-api-access-4shm8\") pod \"speaker-2hhvl\" (UID: \"486fb2e8-15fe-46c1-b62c-89f2b2abf064\") " pod="metallb-system/speaker-2hhvl" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.714184 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8fb94c8-b6a7-47c1-bf64-c01350b47983-cert\") pod \"controller-6968d8fdc4-rq76j\" (UID: \"d8fb94c8-b6a7-47c1-bf64-c01350b47983\") " pod="metallb-system/controller-6968d8fdc4-rq76j" Feb 02 13:18:37 crc 
kubenswrapper[4721]: I0202 13:18:37.714220 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz9b8\" (UniqueName: \"kubernetes.io/projected/d8fb94c8-b6a7-47c1-bf64-c01350b47983-kube-api-access-tz9b8\") pod \"controller-6968d8fdc4-rq76j\" (UID: \"d8fb94c8-b6a7-47c1-bf64-c01350b47983\") " pod="metallb-system/controller-6968d8fdc4-rq76j" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.714240 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/486fb2e8-15fe-46c1-b62c-89f2b2abf064-metrics-certs\") pod \"speaker-2hhvl\" (UID: \"486fb2e8-15fe-46c1-b62c-89f2b2abf064\") " pod="metallb-system/speaker-2hhvl" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.815508 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/486fb2e8-15fe-46c1-b62c-89f2b2abf064-metallb-excludel2\") pod \"speaker-2hhvl\" (UID: \"486fb2e8-15fe-46c1-b62c-89f2b2abf064\") " pod="metallb-system/speaker-2hhvl" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.815592 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8fb94c8-b6a7-47c1-bf64-c01350b47983-metrics-certs\") pod \"controller-6968d8fdc4-rq76j\" (UID: \"d8fb94c8-b6a7-47c1-bf64-c01350b47983\") " pod="metallb-system/controller-6968d8fdc4-rq76j" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.815620 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4shm8\" (UniqueName: \"kubernetes.io/projected/486fb2e8-15fe-46c1-b62c-89f2b2abf064-kube-api-access-4shm8\") pod \"speaker-2hhvl\" (UID: \"486fb2e8-15fe-46c1-b62c-89f2b2abf064\") " pod="metallb-system/speaker-2hhvl" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.815706 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8fb94c8-b6a7-47c1-bf64-c01350b47983-cert\") pod \"controller-6968d8fdc4-rq76j\" (UID: \"d8fb94c8-b6a7-47c1-bf64-c01350b47983\") " pod="metallb-system/controller-6968d8fdc4-rq76j" Feb 02 13:18:37 crc kubenswrapper[4721]: E0202 13:18:37.815776 4721 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 02 13:18:37 crc kubenswrapper[4721]: E0202 13:18:37.815866 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8fb94c8-b6a7-47c1-bf64-c01350b47983-metrics-certs podName:d8fb94c8-b6a7-47c1-bf64-c01350b47983 nodeName:}" failed. No retries permitted until 2026-02-02 13:18:38.315839737 +0000 UTC m=+1058.618354206 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d8fb94c8-b6a7-47c1-bf64-c01350b47983-metrics-certs") pod "controller-6968d8fdc4-rq76j" (UID: "d8fb94c8-b6a7-47c1-bf64-c01350b47983") : secret "controller-certs-secret" not found Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.816032 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz9b8\" (UniqueName: \"kubernetes.io/projected/d8fb94c8-b6a7-47c1-bf64-c01350b47983-kube-api-access-tz9b8\") pod \"controller-6968d8fdc4-rq76j\" (UID: \"d8fb94c8-b6a7-47c1-bf64-c01350b47983\") " pod="metallb-system/controller-6968d8fdc4-rq76j" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.816108 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/486fb2e8-15fe-46c1-b62c-89f2b2abf064-metrics-certs\") pod \"speaker-2hhvl\" (UID: \"486fb2e8-15fe-46c1-b62c-89f2b2abf064\") " pod="metallb-system/speaker-2hhvl" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.816252 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/486fb2e8-15fe-46c1-b62c-89f2b2abf064-memberlist\") pod \"speaker-2hhvl\" (UID: \"486fb2e8-15fe-46c1-b62c-89f2b2abf064\") " pod="metallb-system/speaker-2hhvl" Feb 02 13:18:37 crc kubenswrapper[4721]: E0202 13:18:37.816455 4721 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 02 13:18:37 crc kubenswrapper[4721]: E0202 13:18:37.816502 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/486fb2e8-15fe-46c1-b62c-89f2b2abf064-memberlist podName:486fb2e8-15fe-46c1-b62c-89f2b2abf064 nodeName:}" failed. No retries permitted until 2026-02-02 13:18:38.316486694 +0000 UTC m=+1058.619001083 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/486fb2e8-15fe-46c1-b62c-89f2b2abf064-memberlist") pod "speaker-2hhvl" (UID: "486fb2e8-15fe-46c1-b62c-89f2b2abf064") : secret "metallb-memberlist" not found Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.817237 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/486fb2e8-15fe-46c1-b62c-89f2b2abf064-metallb-excludel2\") pod \"speaker-2hhvl\" (UID: \"486fb2e8-15fe-46c1-b62c-89f2b2abf064\") " pod="metallb-system/speaker-2hhvl" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.818608 4721 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.821612 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/486fb2e8-15fe-46c1-b62c-89f2b2abf064-metrics-certs\") pod \"speaker-2hhvl\" (UID: \"486fb2e8-15fe-46c1-b62c-89f2b2abf064\") " pod="metallb-system/speaker-2hhvl" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.831910 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8fb94c8-b6a7-47c1-bf64-c01350b47983-cert\") pod \"controller-6968d8fdc4-rq76j\" (UID: \"d8fb94c8-b6a7-47c1-bf64-c01350b47983\") " pod="metallb-system/controller-6968d8fdc4-rq76j" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.838333 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4shm8\" (UniqueName: \"kubernetes.io/projected/486fb2e8-15fe-46c1-b62c-89f2b2abf064-kube-api-access-4shm8\") pod \"speaker-2hhvl\" (UID: \"486fb2e8-15fe-46c1-b62c-89f2b2abf064\") " pod="metallb-system/speaker-2hhvl" Feb 02 13:18:37 crc kubenswrapper[4721]: I0202 13:18:37.841662 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz9b8\" (UniqueName: \"kubernetes.io/projected/d8fb94c8-b6a7-47c1-bf64-c01350b47983-kube-api-access-tz9b8\") pod \"controller-6968d8fdc4-rq76j\" (UID: \"d8fb94c8-b6a7-47c1-bf64-c01350b47983\") " pod="metallb-system/controller-6968d8fdc4-rq76j" Feb 02 13:18:38 crc kubenswrapper[4721]: I0202 13:18:38.121015 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4fda33e0-d0a3-4266-aeb1-fc07965d8c35-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-4t8pn\" (UID: \"4fda33e0-d0a3-4266-aeb1-fc07965d8c35\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn" Feb 02 13:18:38 crc kubenswrapper[4721]: I0202 13:18:38.121156 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f685485-23a9-45dd-90cd-62ab47eab713-metrics-certs\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:38 crc kubenswrapper[4721]: I0202 13:18:38.124570 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5f685485-23a9-45dd-90cd-62ab47eab713-metrics-certs\") pod \"frr-k8s-8ts6n\" (UID: \"5f685485-23a9-45dd-90cd-62ab47eab713\") " pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:38 crc kubenswrapper[4721]: I0202 13:18:38.124697 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/4fda33e0-d0a3-4266-aeb1-fc07965d8c35-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-4t8pn\" (UID: \"4fda33e0-d0a3-4266-aeb1-fc07965d8c35\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn" Feb 02 13:18:38 crc kubenswrapper[4721]: I0202 13:18:38.324057 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/486fb2e8-15fe-46c1-b62c-89f2b2abf064-memberlist\") pod \"speaker-2hhvl\" (UID: \"486fb2e8-15fe-46c1-b62c-89f2b2abf064\") " pod="metallb-system/speaker-2hhvl" Feb 02 13:18:38 crc kubenswrapper[4721]: I0202 13:18:38.324214 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8fb94c8-b6a7-47c1-bf64-c01350b47983-metrics-certs\") pod \"controller-6968d8fdc4-rq76j\" (UID: \"d8fb94c8-b6a7-47c1-bf64-c01350b47983\") " pod="metallb-system/controller-6968d8fdc4-rq76j" Feb 02 13:18:38 crc kubenswrapper[4721]: E0202 13:18:38.324228 4721 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 02 13:18:38 crc kubenswrapper[4721]: E0202 13:18:38.324294 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/486fb2e8-15fe-46c1-b62c-89f2b2abf064-memberlist podName:486fb2e8-15fe-46c1-b62c-89f2b2abf064 nodeName:}" failed. No retries permitted until 2026-02-02 13:18:39.324276561 +0000 UTC m=+1059.626790950 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/486fb2e8-15fe-46c1-b62c-89f2b2abf064-memberlist") pod "speaker-2hhvl" (UID: "486fb2e8-15fe-46c1-b62c-89f2b2abf064") : secret "metallb-memberlist" not found Feb 02 13:18:38 crc kubenswrapper[4721]: I0202 13:18:38.327882 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d8fb94c8-b6a7-47c1-bf64-c01350b47983-metrics-certs\") pod \"controller-6968d8fdc4-rq76j\" (UID: \"d8fb94c8-b6a7-47c1-bf64-c01350b47983\") " pod="metallb-system/controller-6968d8fdc4-rq76j" Feb 02 13:18:38 crc kubenswrapper[4721]: I0202 13:18:38.370574 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:38 crc kubenswrapper[4721]: I0202 13:18:38.377335 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn" Feb 02 13:18:38 crc kubenswrapper[4721]: I0202 13:18:38.491372 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-rq76j" Feb 02 13:18:38 crc kubenswrapper[4721]: I0202 13:18:38.787204 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn"] Feb 02 13:18:38 crc kubenswrapper[4721]: I0202 13:18:38.788742 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8ts6n" event={"ID":"5f685485-23a9-45dd-90cd-62ab47eab713","Type":"ContainerStarted","Data":"3c527d57a08a523721a1d2de7391df538f635da0ad2beb4688f31b450cf2070b"} Feb 02 13:18:38 crc kubenswrapper[4721]: W0202 13:18:38.792744 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fda33e0_d0a3_4266_aeb1_fc07965d8c35.slice/crio-22e4ed4f177ecfb7f4d8515b57e888c9baa5330adfb250d0dc3d7cb511d4ba6e WatchSource:0}: Error finding container 22e4ed4f177ecfb7f4d8515b57e888c9baa5330adfb250d0dc3d7cb511d4ba6e: Status 404 returned error can't find the container with id 22e4ed4f177ecfb7f4d8515b57e888c9baa5330adfb250d0dc3d7cb511d4ba6e Feb 02 13:18:39 crc kubenswrapper[4721]: W0202 13:18:39.015624 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8fb94c8_b6a7_47c1_bf64_c01350b47983.slice/crio-e2a18b619088f9a7400300bce19947499337bdf619bcac5b0697c49046b670ce WatchSource:0}: Error finding container e2a18b619088f9a7400300bce19947499337bdf619bcac5b0697c49046b670ce: Status 404 returned error can't find the container with id e2a18b619088f9a7400300bce19947499337bdf619bcac5b0697c49046b670ce Feb 02 13:18:39 crc kubenswrapper[4721]: I0202 13:18:39.017189 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-rq76j"] Feb 02 13:18:39 crc kubenswrapper[4721]: I0202 13:18:39.342872 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/486fb2e8-15fe-46c1-b62c-89f2b2abf064-memberlist\") pod \"speaker-2hhvl\" (UID: \"486fb2e8-15fe-46c1-b62c-89f2b2abf064\") " pod="metallb-system/speaker-2hhvl" Feb 02 13:18:39 crc kubenswrapper[4721]: I0202 13:18:39.351889 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/486fb2e8-15fe-46c1-b62c-89f2b2abf064-memberlist\") pod \"speaker-2hhvl\" (UID: \"486fb2e8-15fe-46c1-b62c-89f2b2abf064\") " pod="metallb-system/speaker-2hhvl" Feb 02 13:18:39 crc kubenswrapper[4721]: I0202 13:18:39.367796 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-2hhvl" Feb 02 13:18:39 crc kubenswrapper[4721]: W0202 13:18:39.411074 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod486fb2e8_15fe_46c1_b62c_89f2b2abf064.slice/crio-b3e8d397e687e845a1179eb25a3f460b85adfe038222c9a0ebf9a5837ef9f753 WatchSource:0}: Error finding container b3e8d397e687e845a1179eb25a3f460b85adfe038222c9a0ebf9a5837ef9f753: Status 404 returned error can't find the container with id b3e8d397e687e845a1179eb25a3f460b85adfe038222c9a0ebf9a5837ef9f753 Feb 02 13:18:39 crc kubenswrapper[4721]: I0202 13:18:39.808203 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn" event={"ID":"4fda33e0-d0a3-4266-aeb1-fc07965d8c35","Type":"ContainerStarted","Data":"22e4ed4f177ecfb7f4d8515b57e888c9baa5330adfb250d0dc3d7cb511d4ba6e"} Feb 02 13:18:39 crc kubenswrapper[4721]: I0202 13:18:39.812150 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2hhvl" event={"ID":"486fb2e8-15fe-46c1-b62c-89f2b2abf064","Type":"ContainerStarted","Data":"4e01c83cc4931fdc3383462b8c5d90108b3cba3fc105a860ae6d9f279d62b7b8"} Feb 02 13:18:39 crc kubenswrapper[4721]: I0202 13:18:39.812190 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2hhvl" event={"ID":"486fb2e8-15fe-46c1-b62c-89f2b2abf064","Type":"ContainerStarted","Data":"b3e8d397e687e845a1179eb25a3f460b85adfe038222c9a0ebf9a5837ef9f753"} Feb 02 13:18:39 crc kubenswrapper[4721]: I0202 13:18:39.815447 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-rq76j" event={"ID":"d8fb94c8-b6a7-47c1-bf64-c01350b47983","Type":"ContainerStarted","Data":"d8803cd645e8d946431f086660924d525d3399b7fecc9d2feb715e6cf0f502d3"} Feb 02 13:18:39 crc kubenswrapper[4721]: I0202 13:18:39.815473 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-rq76j" event={"ID":"d8fb94c8-b6a7-47c1-bf64-c01350b47983","Type":"ContainerStarted","Data":"3c1022a20bdd14d59f969f522a603244b11617cfc63c8b6f56b86319e605b77f"} Feb 02 13:18:39 crc kubenswrapper[4721]: I0202 13:18:39.815483 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-rq76j" event={"ID":"d8fb94c8-b6a7-47c1-bf64-c01350b47983","Type":"ContainerStarted","Data":"e2a18b619088f9a7400300bce19947499337bdf619bcac5b0697c49046b670ce"} Feb 02 13:18:39 crc kubenswrapper[4721]: I0202 13:18:39.816350 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-rq76j" Feb 02 13:18:39 crc kubenswrapper[4721]: I0202 13:18:39.850302 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-rq76j" podStartSLOduration=2.850283362 podStartE2EDuration="2.850283362s" podCreationTimestamp="2026-02-02 13:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:18:39.845835523 +0000 UTC m=+1060.148349922" watchObservedRunningTime="2026-02-02 13:18:39.850283362 +0000 UTC m=+1060.152797751" Feb 02 13:18:40 crc kubenswrapper[4721]: I0202 13:18:40.830171 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2hhvl" 
event={"ID":"486fb2e8-15fe-46c1-b62c-89f2b2abf064","Type":"ContainerStarted","Data":"20db97f9c0ed2d54ccc6d3e4d3d55445f8b7bc1df45967185d9d66a3718eaa3d"} Feb 02 13:18:40 crc kubenswrapper[4721]: I0202 13:18:40.856618 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-2hhvl" podStartSLOduration=3.856599817 podStartE2EDuration="3.856599817s" podCreationTimestamp="2026-02-02 13:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:18:40.849958728 +0000 UTC m=+1061.152473127" watchObservedRunningTime="2026-02-02 13:18:40.856599817 +0000 UTC m=+1061.159114206" Feb 02 13:18:41 crc kubenswrapper[4721]: I0202 13:18:41.840044 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-2hhvl" Feb 02 13:18:44 crc kubenswrapper[4721]: I0202 13:18:44.763545 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:18:44 crc kubenswrapper[4721]: I0202 13:18:44.764255 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:18:44 crc kubenswrapper[4721]: I0202 13:18:44.764320 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:18:44 crc kubenswrapper[4721]: I0202 13:18:44.882171 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e250ca33160c82bc83b5c1d01cc482ebd55cdb3c1b9ae291d6af786cb617e66"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 13:18:44 crc kubenswrapper[4721]: I0202 13:18:44.882255 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://3e250ca33160c82bc83b5c1d01cc482ebd55cdb3c1b9ae291d6af786cb617e66" gracePeriod=600 Feb 02 13:18:45 crc kubenswrapper[4721]: I0202 13:18:45.892038 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="3e250ca33160c82bc83b5c1d01cc482ebd55cdb3c1b9ae291d6af786cb617e66" exitCode=0 Feb 02 13:18:45 crc kubenswrapper[4721]: I0202 13:18:45.892116 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"3e250ca33160c82bc83b5c1d01cc482ebd55cdb3c1b9ae291d6af786cb617e66"} Feb 02 13:18:45 crc kubenswrapper[4721]: I0202 13:18:45.892378 4721 scope.go:117] "RemoveContainer" containerID="4e271bf7e19d8205d47335a427c173d1e8d60e0f2a6167b224679306973cc1cc" Feb 02 13:18:46 crc kubenswrapper[4721]: I0202 13:18:46.901459 4721 generic.go:334] "Generic (PLEG): container finished" 
podID="5f685485-23a9-45dd-90cd-62ab47eab713" containerID="03d0fc817d886ad0f0ee26a111fa8a39a98869ad531602532cb9fb0031c9ea49" exitCode=0 Feb 02 13:18:46 crc kubenswrapper[4721]: I0202 13:18:46.901500 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8ts6n" event={"ID":"5f685485-23a9-45dd-90cd-62ab47eab713","Type":"ContainerDied","Data":"03d0fc817d886ad0f0ee26a111fa8a39a98869ad531602532cb9fb0031c9ea49"} Feb 02 13:18:46 crc kubenswrapper[4721]: I0202 13:18:46.903279 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn" event={"ID":"4fda33e0-d0a3-4266-aeb1-fc07965d8c35","Type":"ContainerStarted","Data":"315eeb7e4fb2b03ec8db702044e7c5ba8604f9647e470872dd5a289ffe0ce83a"} Feb 02 13:18:46 crc kubenswrapper[4721]: I0202 13:18:46.903568 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn" Feb 02 13:18:46 crc kubenswrapper[4721]: I0202 13:18:46.906388 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"56e02e958f304734b98b90c0b35547a7aaeb3ba27ad6cd35ef754f549abd2513"} Feb 02 13:18:46 crc kubenswrapper[4721]: I0202 13:18:46.958020 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn" podStartSLOduration=2.978323505 podStartE2EDuration="9.958003139s" podCreationTimestamp="2026-02-02 13:18:37 +0000 UTC" firstStartedPulling="2026-02-02 13:18:38.799128932 +0000 UTC m=+1059.101643321" lastFinishedPulling="2026-02-02 13:18:45.778808566 +0000 UTC m=+1066.081322955" observedRunningTime="2026-02-02 13:18:46.953460047 +0000 UTC m=+1067.255974436" watchObservedRunningTime="2026-02-02 13:18:46.958003139 +0000 UTC m=+1067.260517528" Feb 02 13:18:47 crc kubenswrapper[4721]: I0202 13:18:47.916496 4721 generic.go:334] "Generic (PLEG): container finished" podID="5f685485-23a9-45dd-90cd-62ab47eab713" containerID="1dd6f8542f878a906dd085b4da23ddb4399ef98f555265fd1c4109c17abb988b" exitCode=0 Feb 02 13:18:47 crc kubenswrapper[4721]: I0202 13:18:47.916560 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8ts6n" event={"ID":"5f685485-23a9-45dd-90cd-62ab47eab713","Type":"ContainerDied","Data":"1dd6f8542f878a906dd085b4da23ddb4399ef98f555265fd1c4109c17abb988b"} Feb 02 13:18:48 crc kubenswrapper[4721]: E0202 13:18:48.192011 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f685485_23a9_45dd_90cd_62ab47eab713.slice/crio-conmon-64bb773d72a6c4efd1a2ed88c4109619b1ebdec1a5b49cd18c11cc35b997b852.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f685485_23a9_45dd_90cd_62ab47eab713.slice/crio-64bb773d72a6c4efd1a2ed88c4109619b1ebdec1a5b49cd18c11cc35b997b852.scope\": RecentStats: unable to find data in memory cache]" Feb 02 13:18:48 crc kubenswrapper[4721]: E0202 13:18:48.192134 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f685485_23a9_45dd_90cd_62ab47eab713.slice/crio-64bb773d72a6c4efd1a2ed88c4109619b1ebdec1a5b49cd18c11cc35b997b852.scope\": RecentStats: unable to find data 
in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f685485_23a9_45dd_90cd_62ab47eab713.slice/crio-conmon-64bb773d72a6c4efd1a2ed88c4109619b1ebdec1a5b49cd18c11cc35b997b852.scope\": RecentStats: unable to find data in memory cache]" Feb 02 13:18:49 crc kubenswrapper[4721]: I0202 13:18:49.141018 4721 generic.go:334] "Generic (PLEG): container finished" podID="5f685485-23a9-45dd-90cd-62ab47eab713" containerID="64bb773d72a6c4efd1a2ed88c4109619b1ebdec1a5b49cd18c11cc35b997b852" exitCode=0 Feb 02 13:18:49 crc kubenswrapper[4721]: I0202 13:18:49.141126 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8ts6n" event={"ID":"5f685485-23a9-45dd-90cd-62ab47eab713","Type":"ContainerDied","Data":"64bb773d72a6c4efd1a2ed88c4109619b1ebdec1a5b49cd18c11cc35b997b852"} Feb 02 13:18:49 crc kubenswrapper[4721]: I0202 13:18:49.373195 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-2hhvl" Feb 02 13:18:50 crc kubenswrapper[4721]: I0202 13:18:50.155867 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8ts6n" event={"ID":"5f685485-23a9-45dd-90cd-62ab47eab713","Type":"ContainerStarted","Data":"5e48b3a0ec805fb51069f290912489d96cb60f8b72eae9fa99a8c8fc7c14f2be"} Feb 02 13:18:50 crc kubenswrapper[4721]: I0202 13:18:50.156198 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8ts6n" event={"ID":"5f685485-23a9-45dd-90cd-62ab47eab713","Type":"ContainerStarted","Data":"a64fa8b85d8bbf563e1cc9518dd9eb6c3a897f8cd2422104902533866289d43c"} Feb 02 13:18:50 crc kubenswrapper[4721]: I0202 13:18:50.156229 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8ts6n" event={"ID":"5f685485-23a9-45dd-90cd-62ab47eab713","Type":"ContainerStarted","Data":"c6adc3f6a792763cfb4a522e36a65254e33d17da87f5bce9cff96ee64cdba71a"} Feb 02 13:18:50 crc kubenswrapper[4721]: I0202 13:18:50.156240 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8ts6n" event={"ID":"5f685485-23a9-45dd-90cd-62ab47eab713","Type":"ContainerStarted","Data":"a6a02cd5e1ffaadb75893b390ecd635f00187ec0edde1848d141cf08f58a4b71"} Feb 02 13:18:50 crc kubenswrapper[4721]: I0202 13:18:50.156248 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8ts6n" event={"ID":"5f685485-23a9-45dd-90cd-62ab47eab713","Type":"ContainerStarted","Data":"62bb24741d74bc1b1630da97df0f7bd565db0465def4b5400f8055ee3f909bef"} Feb 02 13:18:51 crc kubenswrapper[4721]: I0202 13:18:51.171778 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8ts6n" event={"ID":"5f685485-23a9-45dd-90cd-62ab47eab713","Type":"ContainerStarted","Data":"7639a0f645b1feb6aeacbacb7cffe21d371b2fa359e6af58f1d5c26c1e543e96"} Feb 02 13:18:51 crc kubenswrapper[4721]: I0202 13:18:51.173620 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:51 crc kubenswrapper[4721]: I0202 13:18:51.194125 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-8ts6n" podStartSLOduration=6.948643856 podStartE2EDuration="14.194102957s" podCreationTimestamp="2026-02-02 13:18:37 +0000 UTC" firstStartedPulling="2026-02-02 13:18:38.509993456 +0000 UTC m=+1058.812507845" lastFinishedPulling="2026-02-02 13:18:45.755452557 +0000 UTC m=+1066.057966946" observedRunningTime="2026-02-02 13:18:51.193317445 +0000 UTC m=+1071.495831834" 
watchObservedRunningTime="2026-02-02 13:18:51.194102957 +0000 UTC m=+1071.496617366" Feb 02 13:18:52 crc kubenswrapper[4721]: I0202 13:18:52.092541 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-m68ht"] Feb 02 13:18:52 crc kubenswrapper[4721]: I0202 13:18:52.094158 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-m68ht" Feb 02 13:18:52 crc kubenswrapper[4721]: I0202 13:18:52.095667 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-f4l7g" Feb 02 13:18:52 crc kubenswrapper[4721]: I0202 13:18:52.097064 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 02 13:18:52 crc kubenswrapper[4721]: I0202 13:18:52.097283 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 02 13:18:52 crc kubenswrapper[4721]: I0202 13:18:52.105398 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-m68ht"] Feb 02 13:18:52 crc kubenswrapper[4721]: I0202 13:18:52.182426 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdsw8\" (UniqueName: \"kubernetes.io/projected/0014b6b6-c71c-4e95-8297-3eb2fdc64a74-kube-api-access-kdsw8\") pod \"openstack-operator-index-m68ht\" (UID: \"0014b6b6-c71c-4e95-8297-3eb2fdc64a74\") " pod="openstack-operators/openstack-operator-index-m68ht" Feb 02 13:18:52 crc kubenswrapper[4721]: I0202 13:18:52.284290 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdsw8\" (UniqueName: \"kubernetes.io/projected/0014b6b6-c71c-4e95-8297-3eb2fdc64a74-kube-api-access-kdsw8\") pod \"openstack-operator-index-m68ht\" (UID: \"0014b6b6-c71c-4e95-8297-3eb2fdc64a74\") " pod="openstack-operators/openstack-operator-index-m68ht" Feb 02 13:18:52 crc kubenswrapper[4721]: I0202 13:18:52.304839 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdsw8\" (UniqueName: \"kubernetes.io/projected/0014b6b6-c71c-4e95-8297-3eb2fdc64a74-kube-api-access-kdsw8\") pod \"openstack-operator-index-m68ht\" (UID: \"0014b6b6-c71c-4e95-8297-3eb2fdc64a74\") " pod="openstack-operators/openstack-operator-index-m68ht" Feb 02 13:18:52 crc kubenswrapper[4721]: I0202 13:18:52.413128 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-m68ht" Feb 02 13:18:52 crc kubenswrapper[4721]: I0202 13:18:52.843461 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-m68ht"] Feb 02 13:18:53 crc kubenswrapper[4721]: I0202 13:18:53.188540 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m68ht" event={"ID":"0014b6b6-c71c-4e95-8297-3eb2fdc64a74","Type":"ContainerStarted","Data":"3249ab65e4b96c53173fa3e46efe93bbc8beaf0a9ad7e6942999d6268e128321"} Feb 02 13:18:53 crc kubenswrapper[4721]: I0202 13:18:53.371920 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:53 crc kubenswrapper[4721]: I0202 13:18:53.409755 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:18:56 crc kubenswrapper[4721]: I0202 13:18:56.215127 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m68ht" event={"ID":"0014b6b6-c71c-4e95-8297-3eb2fdc64a74","Type":"ContainerStarted","Data":"da2a1fef91a542e480ee355b06c71667fc75ede6837e2adeb932832367662bcb"} Feb 02 13:18:56 crc kubenswrapper[4721]: I0202 13:18:56.229891 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-m68ht" podStartSLOduration=1.6442831820000001 podStartE2EDuration="4.229863669s" podCreationTimestamp="2026-02-02 13:18:52 +0000 UTC" firstStartedPulling="2026-02-02 13:18:52.854540703 +0000 UTC m=+1073.157055092" lastFinishedPulling="2026-02-02 13:18:55.44012119 +0000 UTC m=+1075.742635579" observedRunningTime="2026-02-02 13:18:56.227875606 +0000 UTC m=+1076.530390035" watchObservedRunningTime="2026-02-02 13:18:56.229863669 +0000 UTC m=+1076.532378108" Feb 02 13:18:56 crc kubenswrapper[4721]: I0202 13:18:56.258111 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-m68ht"] Feb 02 13:18:57 crc kubenswrapper[4721]: I0202 13:18:57.001242 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-lxqsx"] Feb 02 13:18:57 crc kubenswrapper[4721]: I0202 13:18:57.003334 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-lxqsx" Feb 02 13:18:57 crc kubenswrapper[4721]: I0202 13:18:57.012871 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lxqsx"] Feb 02 13:18:57 crc kubenswrapper[4721]: I0202 13:18:57.058719 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5b94\" (UniqueName: \"kubernetes.io/projected/abf13eed-433d-4afa-809d-bd863e469366-kube-api-access-b5b94\") pod \"openstack-operator-index-lxqsx\" (UID: \"abf13eed-433d-4afa-809d-bd863e469366\") " pod="openstack-operators/openstack-operator-index-lxqsx" Feb 02 13:18:57 crc kubenswrapper[4721]: I0202 13:18:57.160621 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5b94\" (UniqueName: \"kubernetes.io/projected/abf13eed-433d-4afa-809d-bd863e469366-kube-api-access-b5b94\") pod \"openstack-operator-index-lxqsx\" (UID: \"abf13eed-433d-4afa-809d-bd863e469366\") " pod="openstack-operators/openstack-operator-index-lxqsx" Feb 02 13:18:57 crc kubenswrapper[4721]: I0202 13:18:57.179564 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5b94\" (UniqueName: \"kubernetes.io/projected/abf13eed-433d-4afa-809d-bd863e469366-kube-api-access-b5b94\") pod \"openstack-operator-index-lxqsx\" (UID: \"abf13eed-433d-4afa-809d-bd863e469366\") " pod="openstack-operators/openstack-operator-index-lxqsx" Feb 02 13:18:57 crc kubenswrapper[4721]: I0202 13:18:57.326257 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-lxqsx" Feb 02 13:18:57 crc kubenswrapper[4721]: I0202 13:18:57.723065 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-lxqsx"] Feb 02 13:18:58 crc kubenswrapper[4721]: I0202 13:18:58.233555 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lxqsx" event={"ID":"abf13eed-433d-4afa-809d-bd863e469366","Type":"ContainerStarted","Data":"78af90db2a894f83627a56f1ee1bfc25c6064e4003435476a66e6e31796176d2"} Feb 02 13:18:58 crc kubenswrapper[4721]: I0202 13:18:58.233673 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-m68ht" podUID="0014b6b6-c71c-4e95-8297-3eb2fdc64a74" containerName="registry-server" containerID="cri-o://da2a1fef91a542e480ee355b06c71667fc75ede6837e2adeb932832367662bcb" gracePeriod=2 Feb 02 13:18:58 crc kubenswrapper[4721]: I0202 13:18:58.233634 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-lxqsx" event={"ID":"abf13eed-433d-4afa-809d-bd863e469366","Type":"ContainerStarted","Data":"7b001626c00bd24664dce0b373d83fc851a0cfda003589d7497034a9c0d8d9ae"} Feb 02 13:18:58 crc kubenswrapper[4721]: I0202 13:18:58.254616 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-lxqsx" podStartSLOduration=2.200363574 podStartE2EDuration="2.254595923s" podCreationTimestamp="2026-02-02 13:18:56 +0000 UTC" firstStartedPulling="2026-02-02 13:18:57.726415228 +0000 UTC m=+1078.028929617" lastFinishedPulling="2026-02-02 13:18:57.780647577 +0000 UTC m=+1078.083161966" observedRunningTime="2026-02-02 13:18:58.247058911 +0000 UTC m=+1078.549573310" watchObservedRunningTime="2026-02-02 13:18:58.254595923 +0000 UTC 
m=+1078.557110322" Feb 02 13:18:58 crc kubenswrapper[4721]: I0202 13:18:58.393229 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-4t8pn" Feb 02 13:18:58 crc kubenswrapper[4721]: I0202 13:18:58.496831 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-rq76j" Feb 02 13:18:58 crc kubenswrapper[4721]: I0202 13:18:58.714356 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-m68ht" Feb 02 13:18:58 crc kubenswrapper[4721]: I0202 13:18:58.892583 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdsw8\" (UniqueName: \"kubernetes.io/projected/0014b6b6-c71c-4e95-8297-3eb2fdc64a74-kube-api-access-kdsw8\") pod \"0014b6b6-c71c-4e95-8297-3eb2fdc64a74\" (UID: \"0014b6b6-c71c-4e95-8297-3eb2fdc64a74\") " Feb 02 13:18:58 crc kubenswrapper[4721]: I0202 13:18:58.898692 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0014b6b6-c71c-4e95-8297-3eb2fdc64a74-kube-api-access-kdsw8" (OuterVolumeSpecName: "kube-api-access-kdsw8") pod "0014b6b6-c71c-4e95-8297-3eb2fdc64a74" (UID: "0014b6b6-c71c-4e95-8297-3eb2fdc64a74"). InnerVolumeSpecName "kube-api-access-kdsw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:18:58 crc kubenswrapper[4721]: I0202 13:18:58.994086 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdsw8\" (UniqueName: \"kubernetes.io/projected/0014b6b6-c71c-4e95-8297-3eb2fdc64a74-kube-api-access-kdsw8\") on node \"crc\" DevicePath \"\"" Feb 02 13:18:59 crc kubenswrapper[4721]: I0202 13:18:59.240779 4721 generic.go:334] "Generic (PLEG): container finished" podID="0014b6b6-c71c-4e95-8297-3eb2fdc64a74" containerID="da2a1fef91a542e480ee355b06c71667fc75ede6837e2adeb932832367662bcb" exitCode=0 Feb 02 13:18:59 crc kubenswrapper[4721]: I0202 13:18:59.240868 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-m68ht" Feb 02 13:18:59 crc kubenswrapper[4721]: I0202 13:18:59.240876 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m68ht" event={"ID":"0014b6b6-c71c-4e95-8297-3eb2fdc64a74","Type":"ContainerDied","Data":"da2a1fef91a542e480ee355b06c71667fc75ede6837e2adeb932832367662bcb"} Feb 02 13:18:59 crc kubenswrapper[4721]: I0202 13:18:59.240933 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m68ht" event={"ID":"0014b6b6-c71c-4e95-8297-3eb2fdc64a74","Type":"ContainerDied","Data":"3249ab65e4b96c53173fa3e46efe93bbc8beaf0a9ad7e6942999d6268e128321"} Feb 02 13:18:59 crc kubenswrapper[4721]: I0202 13:18:59.240951 4721 scope.go:117] "RemoveContainer" containerID="da2a1fef91a542e480ee355b06c71667fc75ede6837e2adeb932832367662bcb" Feb 02 13:18:59 crc kubenswrapper[4721]: I0202 13:18:59.258913 4721 scope.go:117] "RemoveContainer" containerID="da2a1fef91a542e480ee355b06c71667fc75ede6837e2adeb932832367662bcb" Feb 02 13:18:59 crc kubenswrapper[4721]: E0202 13:18:59.259419 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da2a1fef91a542e480ee355b06c71667fc75ede6837e2adeb932832367662bcb\": container with ID starting with da2a1fef91a542e480ee355b06c71667fc75ede6837e2adeb932832367662bcb not found: ID does not exist" containerID="da2a1fef91a542e480ee355b06c71667fc75ede6837e2adeb932832367662bcb" Feb 02 13:18:59 crc kubenswrapper[4721]: I0202 13:18:59.259451 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da2a1fef91a542e480ee355b06c71667fc75ede6837e2adeb932832367662bcb"} err="failed to get container status \"da2a1fef91a542e480ee355b06c71667fc75ede6837e2adeb932832367662bcb\": rpc error: code = NotFound desc = could not find container \"da2a1fef91a542e480ee355b06c71667fc75ede6837e2adeb932832367662bcb\": container with ID starting with da2a1fef91a542e480ee355b06c71667fc75ede6837e2adeb932832367662bcb not found: ID does not exist" Feb 02 13:18:59 crc kubenswrapper[4721]: I0202 13:18:59.274422 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-m68ht"] Feb 02 13:18:59 crc kubenswrapper[4721]: I0202 13:18:59.280378 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-m68ht"] Feb 02 13:19:00 crc kubenswrapper[4721]: I0202 13:19:00.418827 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0014b6b6-c71c-4e95-8297-3eb2fdc64a74" path="/var/lib/kubelet/pods/0014b6b6-c71c-4e95-8297-3eb2fdc64a74/volumes" Feb 02 13:19:07 crc kubenswrapper[4721]: I0202 13:19:07.326847 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-lxqsx" Feb 02 13:19:07 crc kubenswrapper[4721]: I0202 13:19:07.327396 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-lxqsx" Feb 02 13:19:07 crc kubenswrapper[4721]: I0202 13:19:07.365247 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-lxqsx" Feb 02 13:19:08 crc kubenswrapper[4721]: I0202 13:19:08.336801 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-lxqsx" Feb 02 13:19:08 crc 
kubenswrapper[4721]: I0202 13:19:08.374244 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-8ts6n" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.115707 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n"] Feb 02 13:19:15 crc kubenswrapper[4721]: E0202 13:19:15.116582 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0014b6b6-c71c-4e95-8297-3eb2fdc64a74" containerName="registry-server" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.116595 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="0014b6b6-c71c-4e95-8297-3eb2fdc64a74" containerName="registry-server" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.116931 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="0014b6b6-c71c-4e95-8297-3eb2fdc64a74" containerName="registry-server" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.118253 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.122231 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ch7pw" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.124003 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n"] Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.191445 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckmx9\" (UniqueName: \"kubernetes.io/projected/5c48ead1-c06b-4a13-b92a-ce7a474e6233-kube-api-access-ckmx9\") pod \"37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n\" (UID: \"5c48ead1-c06b-4a13-b92a-ce7a474e6233\") " pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.191514 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c48ead1-c06b-4a13-b92a-ce7a474e6233-util\") pod \"37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n\" (UID: \"5c48ead1-c06b-4a13-b92a-ce7a474e6233\") " pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.191594 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c48ead1-c06b-4a13-b92a-ce7a474e6233-bundle\") pod \"37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n\" (UID: \"5c48ead1-c06b-4a13-b92a-ce7a474e6233\") " pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.293835 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c48ead1-c06b-4a13-b92a-ce7a474e6233-bundle\") pod \"37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n\" (UID: \"5c48ead1-c06b-4a13-b92a-ce7a474e6233\") " pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.294030 4721 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckmx9\" (UniqueName: \"kubernetes.io/projected/5c48ead1-c06b-4a13-b92a-ce7a474e6233-kube-api-access-ckmx9\") pod \"37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n\" (UID: \"5c48ead1-c06b-4a13-b92a-ce7a474e6233\") " pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.294058 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c48ead1-c06b-4a13-b92a-ce7a474e6233-util\") pod \"37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n\" (UID: \"5c48ead1-c06b-4a13-b92a-ce7a474e6233\") " pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.294631 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c48ead1-c06b-4a13-b92a-ce7a474e6233-util\") pod \"37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n\" (UID: \"5c48ead1-c06b-4a13-b92a-ce7a474e6233\") " pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.296557 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c48ead1-c06b-4a13-b92a-ce7a474e6233-bundle\") pod \"37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n\" (UID: \"5c48ead1-c06b-4a13-b92a-ce7a474e6233\") " pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.318679 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckmx9\" (UniqueName: \"kubernetes.io/projected/5c48ead1-c06b-4a13-b92a-ce7a474e6233-kube-api-access-ckmx9\") pod \"37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n\" (UID: \"5c48ead1-c06b-4a13-b92a-ce7a474e6233\") " pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.444589 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" Feb 02 13:19:15 crc kubenswrapper[4721]: I0202 13:19:15.859012 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n"] Feb 02 13:19:15 crc kubenswrapper[4721]: W0202 13:19:15.869292 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c48ead1_c06b_4a13_b92a_ce7a474e6233.slice/crio-d0d8199e845beed805cad86e56edb529277d411808ef589eb129d8a543059372 WatchSource:0}: Error finding container d0d8199e845beed805cad86e56edb529277d411808ef589eb129d8a543059372: Status 404 returned error can't find the container with id d0d8199e845beed805cad86e56edb529277d411808ef589eb129d8a543059372 Feb 02 13:19:16 crc kubenswrapper[4721]: I0202 13:19:16.368165 4721 generic.go:334] "Generic (PLEG): container finished" podID="5c48ead1-c06b-4a13-b92a-ce7a474e6233" containerID="a175b67ada1bce963fa1271e6de88c07606188a982974922197fec921731eed6" exitCode=0 Feb 02 13:19:16 crc kubenswrapper[4721]: I0202 13:19:16.368214 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" event={"ID":"5c48ead1-c06b-4a13-b92a-ce7a474e6233","Type":"ContainerDied","Data":"a175b67ada1bce963fa1271e6de88c07606188a982974922197fec921731eed6"} Feb 02 13:19:16 crc kubenswrapper[4721]: I0202 13:19:16.368245 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" event={"ID":"5c48ead1-c06b-4a13-b92a-ce7a474e6233","Type":"ContainerStarted","Data":"d0d8199e845beed805cad86e56edb529277d411808ef589eb129d8a543059372"} Feb 02 13:19:17 crc kubenswrapper[4721]: I0202 13:19:17.377806 4721 generic.go:334] "Generic (PLEG): container finished" podID="5c48ead1-c06b-4a13-b92a-ce7a474e6233" containerID="87ed60fc00e9a438b589227413ad2523c5bbe2c862bd494d5c233bc88185aa3e" exitCode=0 Feb 02 13:19:17 crc kubenswrapper[4721]: I0202 13:19:17.377888 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" event={"ID":"5c48ead1-c06b-4a13-b92a-ce7a474e6233","Type":"ContainerDied","Data":"87ed60fc00e9a438b589227413ad2523c5bbe2c862bd494d5c233bc88185aa3e"} Feb 02 13:19:18 crc kubenswrapper[4721]: I0202 13:19:18.387744 4721 generic.go:334] "Generic (PLEG): container finished" podID="5c48ead1-c06b-4a13-b92a-ce7a474e6233" containerID="0f4fa58b324d00f07fa91a0d980dd156f0ca293cb3afe96b229c216b4a1ef522" exitCode=0 Feb 02 13:19:18 crc kubenswrapper[4721]: I0202 13:19:18.387797 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" event={"ID":"5c48ead1-c06b-4a13-b92a-ce7a474e6233","Type":"ContainerDied","Data":"0f4fa58b324d00f07fa91a0d980dd156f0ca293cb3afe96b229c216b4a1ef522"} Feb 02 13:19:19 crc kubenswrapper[4721]: I0202 13:19:19.701437 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" Feb 02 13:19:19 crc kubenswrapper[4721]: I0202 13:19:19.770606 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c48ead1-c06b-4a13-b92a-ce7a474e6233-bundle\") pod \"5c48ead1-c06b-4a13-b92a-ce7a474e6233\" (UID: \"5c48ead1-c06b-4a13-b92a-ce7a474e6233\") " Feb 02 13:19:19 crc kubenswrapper[4721]: I0202 13:19:19.770687 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckmx9\" (UniqueName: \"kubernetes.io/projected/5c48ead1-c06b-4a13-b92a-ce7a474e6233-kube-api-access-ckmx9\") pod \"5c48ead1-c06b-4a13-b92a-ce7a474e6233\" (UID: \"5c48ead1-c06b-4a13-b92a-ce7a474e6233\") " Feb 02 13:19:19 crc kubenswrapper[4721]: I0202 13:19:19.770724 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c48ead1-c06b-4a13-b92a-ce7a474e6233-util\") pod \"5c48ead1-c06b-4a13-b92a-ce7a474e6233\" (UID: \"5c48ead1-c06b-4a13-b92a-ce7a474e6233\") " Feb 02 13:19:19 crc kubenswrapper[4721]: I0202 13:19:19.771898 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c48ead1-c06b-4a13-b92a-ce7a474e6233-bundle" (OuterVolumeSpecName: "bundle") pod "5c48ead1-c06b-4a13-b92a-ce7a474e6233" (UID: "5c48ead1-c06b-4a13-b92a-ce7a474e6233"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:19:19 crc kubenswrapper[4721]: I0202 13:19:19.776682 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c48ead1-c06b-4a13-b92a-ce7a474e6233-kube-api-access-ckmx9" (OuterVolumeSpecName: "kube-api-access-ckmx9") pod "5c48ead1-c06b-4a13-b92a-ce7a474e6233" (UID: "5c48ead1-c06b-4a13-b92a-ce7a474e6233"). InnerVolumeSpecName "kube-api-access-ckmx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:19:19 crc kubenswrapper[4721]: I0202 13:19:19.787897 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c48ead1-c06b-4a13-b92a-ce7a474e6233-util" (OuterVolumeSpecName: "util") pod "5c48ead1-c06b-4a13-b92a-ce7a474e6233" (UID: "5c48ead1-c06b-4a13-b92a-ce7a474e6233"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:19:19 crc kubenswrapper[4721]: I0202 13:19:19.872197 4721 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c48ead1-c06b-4a13-b92a-ce7a474e6233-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:19:19 crc kubenswrapper[4721]: I0202 13:19:19.872237 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckmx9\" (UniqueName: \"kubernetes.io/projected/5c48ead1-c06b-4a13-b92a-ce7a474e6233-kube-api-access-ckmx9\") on node \"crc\" DevicePath \"\"" Feb 02 13:19:19 crc kubenswrapper[4721]: I0202 13:19:19.872252 4721 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c48ead1-c06b-4a13-b92a-ce7a474e6233-util\") on node \"crc\" DevicePath \"\"" Feb 02 13:19:20 crc kubenswrapper[4721]: I0202 13:19:20.406778 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" event={"ID":"5c48ead1-c06b-4a13-b92a-ce7a474e6233","Type":"ContainerDied","Data":"d0d8199e845beed805cad86e56edb529277d411808ef589eb129d8a543059372"} Feb 02 13:19:20 crc kubenswrapper[4721]: I0202 13:19:20.406839 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n" Feb 02 13:19:20 crc kubenswrapper[4721]: I0202 13:19:20.406862 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0d8199e845beed805cad86e56edb529277d411808ef589eb129d8a543059372" Feb 02 13:19:25 crc kubenswrapper[4721]: I0202 13:19:25.968998 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-b64b9f5cb-mqpbl"] Feb 02 13:19:25 crc kubenswrapper[4721]: E0202 13:19:25.970663 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c48ead1-c06b-4a13-b92a-ce7a474e6233" containerName="util" Feb 02 13:19:25 crc kubenswrapper[4721]: I0202 13:19:25.970743 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c48ead1-c06b-4a13-b92a-ce7a474e6233" containerName="util" Feb 02 13:19:25 crc kubenswrapper[4721]: E0202 13:19:25.970806 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c48ead1-c06b-4a13-b92a-ce7a474e6233" containerName="extract" Feb 02 13:19:25 crc kubenswrapper[4721]: I0202 13:19:25.970885 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c48ead1-c06b-4a13-b92a-ce7a474e6233" containerName="extract" Feb 02 13:19:25 crc kubenswrapper[4721]: E0202 13:19:25.970953 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c48ead1-c06b-4a13-b92a-ce7a474e6233" containerName="pull" Feb 02 13:19:25 crc kubenswrapper[4721]: I0202 13:19:25.971014 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c48ead1-c06b-4a13-b92a-ce7a474e6233" containerName="pull" Feb 02 13:19:25 crc kubenswrapper[4721]: I0202 13:19:25.971268 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c48ead1-c06b-4a13-b92a-ce7a474e6233" containerName="extract" Feb 02 13:19:25 crc kubenswrapper[4721]: I0202 13:19:25.972013 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b64b9f5cb-mqpbl" Feb 02 13:19:25 crc kubenswrapper[4721]: I0202 13:19:25.975487 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-f9rpt" Feb 02 13:19:26 crc kubenswrapper[4721]: I0202 13:19:26.004229 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b64b9f5cb-mqpbl"] Feb 02 13:19:26 crc kubenswrapper[4721]: I0202 13:19:26.075307 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dt7x\" (UniqueName: \"kubernetes.io/projected/e4514067-762e-4638-ad5a-a7d17297bc0d-kube-api-access-7dt7x\") pod \"openstack-operator-controller-init-b64b9f5cb-mqpbl\" (UID: \"e4514067-762e-4638-ad5a-a7d17297bc0d\") " pod="openstack-operators/openstack-operator-controller-init-b64b9f5cb-mqpbl" Feb 02 13:19:26 crc kubenswrapper[4721]: I0202 13:19:26.177583 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dt7x\" (UniqueName: \"kubernetes.io/projected/e4514067-762e-4638-ad5a-a7d17297bc0d-kube-api-access-7dt7x\") pod \"openstack-operator-controller-init-b64b9f5cb-mqpbl\" (UID: \"e4514067-762e-4638-ad5a-a7d17297bc0d\") " pod="openstack-operators/openstack-operator-controller-init-b64b9f5cb-mqpbl" Feb 02 13:19:26 crc kubenswrapper[4721]: I0202 13:19:26.208960 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dt7x\" (UniqueName: \"kubernetes.io/projected/e4514067-762e-4638-ad5a-a7d17297bc0d-kube-api-access-7dt7x\") pod \"openstack-operator-controller-init-b64b9f5cb-mqpbl\" (UID: \"e4514067-762e-4638-ad5a-a7d17297bc0d\") " pod="openstack-operators/openstack-operator-controller-init-b64b9f5cb-mqpbl" Feb 02 13:19:26 crc kubenswrapper[4721]: I0202 13:19:26.295059 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-b64b9f5cb-mqpbl" Feb 02 13:19:26 crc kubenswrapper[4721]: I0202 13:19:26.754449 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-b64b9f5cb-mqpbl"] Feb 02 13:19:27 crc kubenswrapper[4721]: I0202 13:19:27.459514 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b64b9f5cb-mqpbl" event={"ID":"e4514067-762e-4638-ad5a-a7d17297bc0d","Type":"ContainerStarted","Data":"29d5dae63b32855362a786b7f3dddc85373ac96887f27f9ff95144eeaf71899d"} Feb 02 13:19:31 crc kubenswrapper[4721]: I0202 13:19:31.501783 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-b64b9f5cb-mqpbl" event={"ID":"e4514067-762e-4638-ad5a-a7d17297bc0d","Type":"ContainerStarted","Data":"482c43c1cef1082dfd20266dc55096aad6c262195b47f36e5f97c82bfce4c18c"} Feb 02 13:19:31 crc kubenswrapper[4721]: I0202 13:19:31.502401 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-b64b9f5cb-mqpbl" Feb 02 13:19:31 crc kubenswrapper[4721]: I0202 13:19:31.533625 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-b64b9f5cb-mqpbl" podStartSLOduration=2.722034888 podStartE2EDuration="6.533608828s" podCreationTimestamp="2026-02-02 13:19:25 +0000 UTC" firstStartedPulling="2026-02-02 13:19:26.765085274 +0000 UTC m=+1107.067599653" lastFinishedPulling="2026-02-02 13:19:30.576659204 +0000 UTC m=+1110.879173593" observedRunningTime="2026-02-02 13:19:31.525107988 +0000 UTC m=+1111.827622377" watchObservedRunningTime="2026-02-02 13:19:31.533608828 +0000 UTC m=+1111.836123217" Feb 02 13:19:36 crc kubenswrapper[4721]: I0202 13:19:36.311339 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-b64b9f5cb-mqpbl" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.043684 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-8zlv5"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.045316 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-8zlv5" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.048219 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-dmzhd" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.058755 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-8zlv5"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.065500 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-729mv"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.066562 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-729mv" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.068207 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-chjw6" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.080623 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-q5lbf"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.082141 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-q5lbf" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.101183 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-729mv"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.120677 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-6g7pw" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.144919 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-s75st"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.146611 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-s75st" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.151221 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-7hb8c" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.155699 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-s75st"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.181246 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-q5lbf"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.196240 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-sq5w5"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.197306 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-sq5w5" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.204605 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-g6t6m" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.213200 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-sq5w5"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.239351 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p959t\" (UniqueName: \"kubernetes.io/projected/20f771bf-d003-48b0-8e50-0d1217f24b45-kube-api-access-p959t\") pod \"glance-operator-controller-manager-8886f4c47-q5lbf\" (UID: \"20f771bf-d003-48b0-8e50-0d1217f24b45\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-q5lbf" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.239458 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtj6k\" (UniqueName: \"kubernetes.io/projected/0562e590-1a66-4fbc-862d-833bc1600eac-kube-api-access-wtj6k\") pod \"cinder-operator-controller-manager-8d874c8fc-729mv\" (UID: \"0562e590-1a66-4fbc-862d-833bc1600eac\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-729mv" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.239492 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnfsj\" (UniqueName: \"kubernetes.io/projected/0c1486a5-ee95-4cde-9631-3c7c7aa31ae7-kube-api-access-jnfsj\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-8zlv5\" (UID: \"0c1486a5-ee95-4cde-9631-3c7c7aa31ae7\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-8zlv5" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.250977 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-x6p4t"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.252227 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-x6p4t" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.257440 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-jl5tl" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.260645 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-hktcl"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.270084 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.280434 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kqlr4" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.280638 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.286003 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-x6p4t"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.294028 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-qqpfm"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.295448 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-qqpfm" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.298850 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-4ggtg" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.317731 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-hktcl"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.337495 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-5vbh8"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.338820 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-5vbh8" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.340427 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9dc6\" (UniqueName: \"kubernetes.io/projected/e5a04e0d-8a73-4f21-a61d-374d7a5784fb-kube-api-access-k9dc6\") pod \"heat-operator-controller-manager-69d6db494d-sq5w5\" (UID: \"e5a04e0d-8a73-4f21-a61d-374d7a5784fb\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-sq5w5" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.340505 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtj6k\" (UniqueName: \"kubernetes.io/projected/0562e590-1a66-4fbc-862d-833bc1600eac-kube-api-access-wtj6k\") pod \"cinder-operator-controller-manager-8d874c8fc-729mv\" (UID: \"0562e590-1a66-4fbc-862d-833bc1600eac\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-729mv" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.340540 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48twm\" (UniqueName: \"kubernetes.io/projected/23be57b1-6b3e-4346-93f9-2c45b0562d2b-kube-api-access-48twm\") pod \"designate-operator-controller-manager-6d9697b7f4-s75st\" (UID: \"23be57b1-6b3e-4346-93f9-2c45b0562d2b\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-s75st" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.340576 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnfsj\" (UniqueName: \"kubernetes.io/projected/0c1486a5-ee95-4cde-9631-3c7c7aa31ae7-kube-api-access-jnfsj\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-8zlv5\" (UID: \"0c1486a5-ee95-4cde-9631-3c7c7aa31ae7\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-8zlv5" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.341240 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p959t\" (UniqueName: \"kubernetes.io/projected/20f771bf-d003-48b0-8e50-0d1217f24b45-kube-api-access-p959t\") pod \"glance-operator-controller-manager-8886f4c47-q5lbf\" (UID: \"20f771bf-d003-48b0-8e50-0d1217f24b45\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-q5lbf" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.346741 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-ln2lt" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.378500 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-qqpfm"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.378530 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p959t\" (UniqueName: \"kubernetes.io/projected/20f771bf-d003-48b0-8e50-0d1217f24b45-kube-api-access-p959t\") pod \"glance-operator-controller-manager-8886f4c47-q5lbf\" (UID: \"20f771bf-d003-48b0-8e50-0d1217f24b45\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-q5lbf" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.381742 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtj6k\" (UniqueName: 
\"kubernetes.io/projected/0562e590-1a66-4fbc-862d-833bc1600eac-kube-api-access-wtj6k\") pod \"cinder-operator-controller-manager-8d874c8fc-729mv\" (UID: \"0562e590-1a66-4fbc-862d-833bc1600eac\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-729mv" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.383701 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-5vbh8"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.384983 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-729mv" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.394019 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-5x28t"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.395354 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5x28t" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.398990 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-t4sx5" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.399649 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnfsj\" (UniqueName: \"kubernetes.io/projected/0c1486a5-ee95-4cde-9631-3c7c7aa31ae7-kube-api-access-jnfsj\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-8zlv5\" (UID: \"0c1486a5-ee95-4cde-9631-3c7c7aa31ae7\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-8zlv5" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.407457 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-q5lbf" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.470922 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srt9d\" (UniqueName: \"kubernetes.io/projected/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-kube-api-access-srt9d\") pod \"infra-operator-controller-manager-79955696d6-hktcl\" (UID: \"9d11c3e4-10b4-4ff4-aaa2-04e342d984b4\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.471261 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5s4n\" (UniqueName: \"kubernetes.io/projected/8a86dacc-de73-4b52-994c-3b089ee427cc-kube-api-access-h5s4n\") pod \"horizon-operator-controller-manager-5fb775575f-x6p4t\" (UID: \"8a86dacc-de73-4b52-994c-3b089ee427cc\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-x6p4t" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.471329 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert\") pod \"infra-operator-controller-manager-79955696d6-hktcl\" (UID: \"9d11c3e4-10b4-4ff4-aaa2-04e342d984b4\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.471358 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9dc6\" (UniqueName: \"kubernetes.io/projected/e5a04e0d-8a73-4f21-a61d-374d7a5784fb-kube-api-access-k9dc6\") pod \"heat-operator-controller-manager-69d6db494d-sq5w5\" (UID: \"e5a04e0d-8a73-4f21-a61d-374d7a5784fb\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-sq5w5" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.471387 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9qz2\" (UniqueName: \"kubernetes.io/projected/b9eeabcd-14ef-4800-9f0c-1a3cd515d2aa-kube-api-access-d9qz2\") pod \"keystone-operator-controller-manager-84f48565d4-5vbh8\" (UID: \"b9eeabcd-14ef-4800-9f0c-1a3cd515d2aa\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-5vbh8" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.471512 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48twm\" (UniqueName: \"kubernetes.io/projected/23be57b1-6b3e-4346-93f9-2c45b0562d2b-kube-api-access-48twm\") pod \"designate-operator-controller-manager-6d9697b7f4-s75st\" (UID: \"23be57b1-6b3e-4346-93f9-2c45b0562d2b\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-s75st" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.471604 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsmbn\" (UniqueName: \"kubernetes.io/projected/a13f2341-6b53-4a7b-b67a-4a1d1846805d-kube-api-access-zsmbn\") pod \"ironic-operator-controller-manager-5f4b8bd54d-qqpfm\" (UID: \"a13f2341-6b53-4a7b-b67a-4a1d1846805d\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-qqpfm" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.540038 4721 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-ct6hc"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.541598 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-ct6hc"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.541617 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-sjvjw"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.544145 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ct6hc" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.544840 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-5x28t"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.544887 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-sjvjw"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.544985 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-sjvjw" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.552312 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-42qq8"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.554878 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-42qq8" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.556835 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-fgcn6" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.556906 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9dc6\" (UniqueName: \"kubernetes.io/projected/e5a04e0d-8a73-4f21-a61d-374d7a5784fb-kube-api-access-k9dc6\") pod \"heat-operator-controller-manager-69d6db494d-sq5w5\" (UID: \"e5a04e0d-8a73-4f21-a61d-374d7a5784fb\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-sq5w5" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.557276 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-kvpsj" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.557408 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-w6nvq" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.569942 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48twm\" (UniqueName: \"kubernetes.io/projected/23be57b1-6b3e-4346-93f9-2c45b0562d2b-kube-api-access-48twm\") pod \"designate-operator-controller-manager-6d9697b7f4-s75st\" (UID: \"23be57b1-6b3e-4346-93f9-2c45b0562d2b\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-s75st" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.574540 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsmbn\" (UniqueName: \"kubernetes.io/projected/a13f2341-6b53-4a7b-b67a-4a1d1846805d-kube-api-access-zsmbn\") pod 
\"ironic-operator-controller-manager-5f4b8bd54d-qqpfm\" (UID: \"a13f2341-6b53-4a7b-b67a-4a1d1846805d\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-qqpfm" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.574974 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqstl\" (UniqueName: \"kubernetes.io/projected/39686eda-a258-408b-bf9c-7ff7d515ed9d-kube-api-access-rqstl\") pod \"manila-operator-controller-manager-7dd968899f-5x28t\" (UID: \"39686eda-a258-408b-bf9c-7ff7d515ed9d\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5x28t" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.575009 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srt9d\" (UniqueName: \"kubernetes.io/projected/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-kube-api-access-srt9d\") pod \"infra-operator-controller-manager-79955696d6-hktcl\" (UID: \"9d11c3e4-10b4-4ff4-aaa2-04e342d984b4\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.575037 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5s4n\" (UniqueName: \"kubernetes.io/projected/8a86dacc-de73-4b52-994c-3b089ee427cc-kube-api-access-h5s4n\") pod \"horizon-operator-controller-manager-5fb775575f-x6p4t\" (UID: \"8a86dacc-de73-4b52-994c-3b089ee427cc\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-x6p4t" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.577128 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-6z258"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.578476 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-6z258" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.584047 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-5fjnh" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.587293 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert\") pod \"infra-operator-controller-manager-79955696d6-hktcl\" (UID: \"9d11c3e4-10b4-4ff4-aaa2-04e342d984b4\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.587360 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9qz2\" (UniqueName: \"kubernetes.io/projected/b9eeabcd-14ef-4800-9f0c-1a3cd515d2aa-kube-api-access-d9qz2\") pod \"keystone-operator-controller-manager-84f48565d4-5vbh8\" (UID: \"b9eeabcd-14ef-4800-9f0c-1a3cd515d2aa\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-5vbh8" Feb 02 13:20:14 crc kubenswrapper[4721]: E0202 13:20:14.589015 4721 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 13:20:14 crc kubenswrapper[4721]: E0202 13:20:14.589176 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert podName:9d11c3e4-10b4-4ff4-aaa2-04e342d984b4 nodeName:}" failed. 
No retries permitted until 2026-02-02 13:20:15.089157231 +0000 UTC m=+1155.391671620 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert") pod "infra-operator-controller-manager-79955696d6-hktcl" (UID: "9d11c3e4-10b4-4ff4-aaa2-04e342d984b4") : secret "infra-operator-webhook-server-cert" not found Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.616174 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-42qq8"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.622737 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsmbn\" (UniqueName: \"kubernetes.io/projected/a13f2341-6b53-4a7b-b67a-4a1d1846805d-kube-api-access-zsmbn\") pod \"ironic-operator-controller-manager-5f4b8bd54d-qqpfm\" (UID: \"a13f2341-6b53-4a7b-b67a-4a1d1846805d\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-qqpfm" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.629435 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srt9d\" (UniqueName: \"kubernetes.io/projected/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-kube-api-access-srt9d\") pod \"infra-operator-controller-manager-79955696d6-hktcl\" (UID: \"9d11c3e4-10b4-4ff4-aaa2-04e342d984b4\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.634994 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-qqpfm" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.652790 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5s4n\" (UniqueName: \"kubernetes.io/projected/8a86dacc-de73-4b52-994c-3b089ee427cc-kube-api-access-h5s4n\") pod \"horizon-operator-controller-manager-5fb775575f-x6p4t\" (UID: \"8a86dacc-de73-4b52-994c-3b089ee427cc\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-x6p4t" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.663142 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-6z258"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.664792 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9qz2\" (UniqueName: \"kubernetes.io/projected/b9eeabcd-14ef-4800-9f0c-1a3cd515d2aa-kube-api-access-d9qz2\") pod \"keystone-operator-controller-manager-84f48565d4-5vbh8\" (UID: \"b9eeabcd-14ef-4800-9f0c-1a3cd515d2aa\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-5vbh8" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.668106 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.678294 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.678713 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-8zlv5" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.682664 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-swlxq" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.682960 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.695021 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl8sw\" (UniqueName: \"kubernetes.io/projected/dc736681-960e-4f76-bc10-25f529da020a-kube-api-access-dl8sw\") pod \"octavia-operator-controller-manager-6687f8d877-42qq8\" (UID: \"dc736681-960e-4f76-bc10-25f529da020a\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-42qq8" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.695146 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-662gz\" (UniqueName: \"kubernetes.io/projected/2d60d537-ea47-42fa-94c3-61704aef0678-kube-api-access-662gz\") pod \"neutron-operator-controller-manager-585dbc889-sjvjw\" (UID: \"2d60d537-ea47-42fa-94c3-61704aef0678\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-sjvjw" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.695194 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqstl\" (UniqueName: \"kubernetes.io/projected/39686eda-a258-408b-bf9c-7ff7d515ed9d-kube-api-access-rqstl\") pod \"manila-operator-controller-manager-7dd968899f-5x28t\" (UID: \"39686eda-a258-408b-bf9c-7ff7d515ed9d\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5x28t" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.695251 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fnk4\" (UniqueName: \"kubernetes.io/projected/1f3087b4-acf0-4a27-9696-bdfb4728e96c-kube-api-access-8fnk4\") pod \"nova-operator-controller-manager-55bff696bd-6z258\" (UID: \"1f3087b4-acf0-4a27-9696-bdfb4728e96c\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-6z258" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.695323 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snq4r\" (UniqueName: \"kubernetes.io/projected/1c4864d3-2fdd-4b98-ac89-aefb49b56187-kube-api-access-snq4r\") pod \"mariadb-operator-controller-manager-67bf948998-ct6hc\" (UID: \"1c4864d3-2fdd-4b98-ac89-aefb49b56187\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ct6hc" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.701963 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-rzjts"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.703528 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rzjts" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.706876 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-bx7js" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.730743 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.732607 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.735550 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-bczqd" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.743715 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqstl\" (UniqueName: \"kubernetes.io/projected/39686eda-a258-408b-bf9c-7ff7d515ed9d-kube-api-access-rqstl\") pod \"manila-operator-controller-manager-7dd968899f-5x28t\" (UID: \"39686eda-a258-408b-bf9c-7ff7d515ed9d\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5x28t" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.753372 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.778877 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-s75st" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.798293 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fnk4\" (UniqueName: \"kubernetes.io/projected/1f3087b4-acf0-4a27-9696-bdfb4728e96c-kube-api-access-8fnk4\") pod \"nova-operator-controller-manager-55bff696bd-6z258\" (UID: \"1f3087b4-acf0-4a27-9696-bdfb4728e96c\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-6z258" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.798377 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chbfv\" (UniqueName: \"kubernetes.io/projected/ae636942-3520-410e-b70a-b4fc19a527ca-kube-api-access-chbfv\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw\" (UID: \"ae636942-3520-410e-b70a-b4fc19a527ca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.798408 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snq4r\" (UniqueName: \"kubernetes.io/projected/1c4864d3-2fdd-4b98-ac89-aefb49b56187-kube-api-access-snq4r\") pod \"mariadb-operator-controller-manager-67bf948998-ct6hc\" (UID: \"1c4864d3-2fdd-4b98-ac89-aefb49b56187\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ct6hc" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.798643 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl8sw\" (UniqueName: \"kubernetes.io/projected/dc736681-960e-4f76-bc10-25f529da020a-kube-api-access-dl8sw\") pod 
\"octavia-operator-controller-manager-6687f8d877-42qq8\" (UID: \"dc736681-960e-4f76-bc10-25f529da020a\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-42qq8" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.798746 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw\" (UID: \"ae636942-3520-410e-b70a-b4fc19a527ca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.798888 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-662gz\" (UniqueName: \"kubernetes.io/projected/2d60d537-ea47-42fa-94c3-61704aef0678-kube-api-access-662gz\") pod \"neutron-operator-controller-manager-585dbc889-sjvjw\" (UID: \"2d60d537-ea47-42fa-94c3-61704aef0678\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-sjvjw" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.819638 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-sq5w5" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.824561 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl8sw\" (UniqueName: \"kubernetes.io/projected/dc736681-960e-4f76-bc10-25f529da020a-kube-api-access-dl8sw\") pod \"octavia-operator-controller-manager-6687f8d877-42qq8\" (UID: \"dc736681-960e-4f76-bc10-25f529da020a\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-42qq8" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.838658 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fnk4\" (UniqueName: \"kubernetes.io/projected/1f3087b4-acf0-4a27-9696-bdfb4728e96c-kube-api-access-8fnk4\") pod \"nova-operator-controller-manager-55bff696bd-6z258\" (UID: \"1f3087b4-acf0-4a27-9696-bdfb4728e96c\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-6z258" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.841461 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-662gz\" (UniqueName: \"kubernetes.io/projected/2d60d537-ea47-42fa-94c3-61704aef0678-kube-api-access-662gz\") pod \"neutron-operator-controller-manager-585dbc889-sjvjw\" (UID: \"2d60d537-ea47-42fa-94c3-61704aef0678\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-sjvjw" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.850662 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.859641 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snq4r\" (UniqueName: \"kubernetes.io/projected/1c4864d3-2fdd-4b98-ac89-aefb49b56187-kube-api-access-snq4r\") pod \"mariadb-operator-controller-manager-67bf948998-ct6hc\" (UID: \"1c4864d3-2fdd-4b98-ac89-aefb49b56187\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ct6hc" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.876404 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-x6p4t" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.889365 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-rzjts"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.912976 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chbfv\" (UniqueName: \"kubernetes.io/projected/ae636942-3520-410e-b70a-b4fc19a527ca-kube-api-access-chbfv\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw\" (UID: \"ae636942-3520-410e-b70a-b4fc19a527ca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.913086 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dqmh\" (UniqueName: \"kubernetes.io/projected/ed67384c-22d3-4466-8990-744b122efbf4-kube-api-access-6dqmh\") pod \"placement-operator-controller-manager-5b964cf4cd-kqdjm\" (UID: \"ed67384c-22d3-4466-8990-744b122efbf4\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.913127 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw\" (UID: \"ae636942-3520-410e-b70a-b4fc19a527ca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.913193 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2svrc\" (UniqueName: \"kubernetes.io/projected/6b33adce-a49a-4ce2-af29-412661aaf062-kube-api-access-2svrc\") pod \"ovn-operator-controller-manager-788c46999f-rzjts\" (UID: \"6b33adce-a49a-4ce2-af29-412661aaf062\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rzjts" Feb 02 13:20:14 crc kubenswrapper[4721]: E0202 13:20:14.913572 4721 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 13:20:14 crc kubenswrapper[4721]: E0202 13:20:14.913613 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert podName:ae636942-3520-410e-b70a-b4fc19a527ca nodeName:}" failed. No retries permitted until 2026-02-02 13:20:15.413597029 +0000 UTC m=+1155.716111418 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" (UID: "ae636942-3520-410e-b70a-b4fc19a527ca") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.927683 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5x28t" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.945866 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chbfv\" (UniqueName: \"kubernetes.io/projected/ae636942-3520-410e-b70a-b4fc19a527ca-kube-api-access-chbfv\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw\" (UID: \"ae636942-3520-410e-b70a-b4fc19a527ca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.955193 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.956605 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.961144 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-zxjhw" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.962429 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv"] Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.964034 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-5vbh8" Feb 02 13:20:14 crc kubenswrapper[4721]: I0202 13:20:14.983786 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ct6hc" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.001924 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-sjvjw" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.007924 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5b9ffd7d65-rgkhb"] Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.009252 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5b9ffd7d65-rgkhb" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.012582 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-vk4zk" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.014604 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dqmh\" (UniqueName: \"kubernetes.io/projected/ed67384c-22d3-4466-8990-744b122efbf4-kube-api-access-6dqmh\") pod \"placement-operator-controller-manager-5b964cf4cd-kqdjm\" (UID: \"ed67384c-22d3-4466-8990-744b122efbf4\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.014724 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2svrc\" (UniqueName: \"kubernetes.io/projected/6b33adce-a49a-4ce2-af29-412661aaf062-kube-api-access-2svrc\") pod \"ovn-operator-controller-manager-788c46999f-rzjts\" (UID: \"6b33adce-a49a-4ce2-af29-412661aaf062\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rzjts" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.036576 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5b9ffd7d65-rgkhb"] Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.037616 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2svrc\" (UniqueName: \"kubernetes.io/projected/6b33adce-a49a-4ce2-af29-412661aaf062-kube-api-access-2svrc\") pod \"ovn-operator-controller-manager-788c46999f-rzjts\" (UID: \"6b33adce-a49a-4ce2-af29-412661aaf062\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rzjts" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.037778 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-42qq8" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.043378 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dqmh\" (UniqueName: \"kubernetes.io/projected/ed67384c-22d3-4466-8990-744b122efbf4-kube-api-access-6dqmh\") pod \"placement-operator-controller-manager-5b964cf4cd-kqdjm\" (UID: \"ed67384c-22d3-4466-8990-744b122efbf4\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.055966 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-6z258" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.057196 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-2828d"] Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.058610 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-2828d" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.061253 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-hdr94" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.065681 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-2828d"] Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.099186 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-4pk6v"] Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.100292 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-4pk6v" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.104282 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-8sckr" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.113541 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rzjts" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.114275 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-4pk6v"] Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.115729 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px5rv\" (UniqueName: \"kubernetes.io/projected/79e5221b-04ee-496d-82b7-16af5b340595-kube-api-access-px5rv\") pod \"telemetry-operator-controller-manager-5b9ffd7d65-rgkhb\" (UID: \"79e5221b-04ee-496d-82b7-16af5b340595\") " pod="openstack-operators/telemetry-operator-controller-manager-5b9ffd7d65-rgkhb" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.115819 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg56x\" (UniqueName: \"kubernetes.io/projected/499ca4ef-3867-407b-ab4a-64fff307e296-kube-api-access-tg56x\") pod \"swift-operator-controller-manager-68fc8c869-79zrv\" (UID: \"499ca4ef-3867-407b-ab4a-64fff307e296\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.115857 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert\") pod \"infra-operator-controller-manager-79955696d6-hktcl\" (UID: \"9d11c3e4-10b4-4ff4-aaa2-04e342d984b4\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:20:15 crc kubenswrapper[4721]: E0202 13:20:15.115953 4721 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 13:20:15 crc kubenswrapper[4721]: E0202 13:20:15.116010 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert podName:9d11c3e4-10b4-4ff4-aaa2-04e342d984b4 nodeName:}" failed. No retries permitted until 2026-02-02 13:20:16.115995633 +0000 UTC m=+1156.418510022 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert") pod "infra-operator-controller-manager-79955696d6-hktcl" (UID: "9d11c3e4-10b4-4ff4-aaa2-04e342d984b4") : secret "infra-operator-webhook-server-cert" not found Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.131415 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.228339 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w294l\" (UniqueName: \"kubernetes.io/projected/60ff9309-fd37-4618-b4f0-38704a558ec0-kube-api-access-w294l\") pod \"test-operator-controller-manager-56f8bfcd9f-2828d\" (UID: \"60ff9309-fd37-4618-b4f0-38704a558ec0\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-2828d" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.228429 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg56x\" (UniqueName: \"kubernetes.io/projected/499ca4ef-3867-407b-ab4a-64fff307e296-kube-api-access-tg56x\") pod \"swift-operator-controller-manager-68fc8c869-79zrv\" (UID: \"499ca4ef-3867-407b-ab4a-64fff307e296\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.228528 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrbbc\" (UniqueName: \"kubernetes.io/projected/058f996d-8009-4f83-864d-177f7b577cf0-kube-api-access-hrbbc\") pod \"watcher-operator-controller-manager-564965969-4pk6v\" (UID: \"058f996d-8009-4f83-864d-177f7b577cf0\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-4pk6v" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.228664 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px5rv\" (UniqueName: \"kubernetes.io/projected/79e5221b-04ee-496d-82b7-16af5b340595-kube-api-access-px5rv\") pod \"telemetry-operator-controller-manager-5b9ffd7d65-rgkhb\" (UID: \"79e5221b-04ee-496d-82b7-16af5b340595\") " pod="openstack-operators/telemetry-operator-controller-manager-5b9ffd7d65-rgkhb" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.256110 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px5rv\" (UniqueName: \"kubernetes.io/projected/79e5221b-04ee-496d-82b7-16af5b340595-kube-api-access-px5rv\") pod \"telemetry-operator-controller-manager-5b9ffd7d65-rgkhb\" (UID: \"79e5221b-04ee-496d-82b7-16af5b340595\") " pod="openstack-operators/telemetry-operator-controller-manager-5b9ffd7d65-rgkhb" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.293437 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p"] Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.294559 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.324148 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.324402 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-njf7c" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.332857 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.339534 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p"] Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.346695 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg56x\" (UniqueName: \"kubernetes.io/projected/499ca4ef-3867-407b-ab4a-64fff307e296-kube-api-access-tg56x\") pod \"swift-operator-controller-manager-68fc8c869-79zrv\" (UID: \"499ca4ef-3867-407b-ab4a-64fff307e296\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.348511 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrbbc\" (UniqueName: \"kubernetes.io/projected/058f996d-8009-4f83-864d-177f7b577cf0-kube-api-access-hrbbc\") pod \"watcher-operator-controller-manager-564965969-4pk6v\" (UID: \"058f996d-8009-4f83-864d-177f7b577cf0\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-4pk6v" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.370763 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w294l\" (UniqueName: \"kubernetes.io/projected/60ff9309-fd37-4618-b4f0-38704a558ec0-kube-api-access-w294l\") pod \"test-operator-controller-manager-56f8bfcd9f-2828d\" (UID: \"60ff9309-fd37-4618-b4f0-38704a558ec0\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-2828d" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.355845 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5b9ffd7d65-rgkhb" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.410473 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w294l\" (UniqueName: \"kubernetes.io/projected/60ff9309-fd37-4618-b4f0-38704a558ec0-kube-api-access-w294l\") pod \"test-operator-controller-manager-56f8bfcd9f-2828d\" (UID: \"60ff9309-fd37-4618-b4f0-38704a558ec0\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-2828d" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.445274 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-729mv" event={"ID":"0562e590-1a66-4fbc-862d-833bc1600eac","Type":"ContainerStarted","Data":"802ab0de64a113538a1aec3f496bfaf10c4f0a0553788be15606e0eb9778f801"} Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.455040 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrbbc\" (UniqueName: \"kubernetes.io/projected/058f996d-8009-4f83-864d-177f7b577cf0-kube-api-access-hrbbc\") pod \"watcher-operator-controller-manager-564965969-4pk6v\" (UID: \"058f996d-8009-4f83-864d-177f7b577cf0\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-4pk6v" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.465324 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-4pk6v" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.479270 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.479480 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw6c5\" (UniqueName: \"kubernetes.io/projected/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-kube-api-access-cw6c5\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.479545 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw\" (UID: \"ae636942-3520-410e-b70a-b4fc19a527ca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.479608 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:15 crc kubenswrapper[4721]: E0202 13:20:15.480970 4721 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 13:20:15 crc kubenswrapper[4721]: E0202 13:20:15.481012 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert podName:ae636942-3520-410e-b70a-b4fc19a527ca nodeName:}" failed. No retries permitted until 2026-02-02 13:20:16.480998445 +0000 UTC m=+1156.783512834 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" (UID: "ae636942-3520-410e-b70a-b4fc19a527ca") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.483032 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xwkhz"] Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.484756 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xwkhz" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.502482 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-gtzwh" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.503806 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xwkhz"] Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.581134 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:15 crc kubenswrapper[4721]: E0202 13:20:15.583742 4721 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 13:20:15 crc kubenswrapper[4721]: E0202 13:20:15.583877 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs podName:55bc1d80-1d29-4e15-baca-49eee6fd3aa5 nodeName:}" failed. No retries permitted until 2026-02-02 13:20:16.083858282 +0000 UTC m=+1156.386372671 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs") pod "openstack-operator-controller-manager-5d6c59fb84-5s25p" (UID: "55bc1d80-1d29-4e15-baca-49eee6fd3aa5") : secret "metrics-server-cert" not found Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.588175 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw6c5\" (UniqueName: \"kubernetes.io/projected/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-kube-api-access-cw6c5\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.588358 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:15 crc kubenswrapper[4721]: E0202 13:20:15.589375 4721 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 13:20:15 crc kubenswrapper[4721]: E0202 13:20:15.589415 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs podName:55bc1d80-1d29-4e15-baca-49eee6fd3aa5 nodeName:}" failed. No retries permitted until 2026-02-02 13:20:16.089400282 +0000 UTC m=+1156.391914671 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs") pod "openstack-operator-controller-manager-5d6c59fb84-5s25p" (UID: "55bc1d80-1d29-4e15-baca-49eee6fd3aa5") : secret "webhook-server-cert" not found Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.622792 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw6c5\" (UniqueName: \"kubernetes.io/projected/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-kube-api-access-cw6c5\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.633474 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.692641 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hb97\" (UniqueName: \"kubernetes.io/projected/56b67b2b-b9fd-4353-88e3-d4f1d44653e2-kube-api-access-5hb97\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xwkhz\" (UID: \"56b67b2b-b9fd-4353-88e3-d4f1d44653e2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xwkhz" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.693139 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-2828d" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.723655 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-729mv"] Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.798198 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hb97\" (UniqueName: \"kubernetes.io/projected/56b67b2b-b9fd-4353-88e3-d4f1d44653e2-kube-api-access-5hb97\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xwkhz\" (UID: \"56b67b2b-b9fd-4353-88e3-d4f1d44653e2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xwkhz" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.798815 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-q5lbf"] Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.820665 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hb97\" (UniqueName: \"kubernetes.io/projected/56b67b2b-b9fd-4353-88e3-d4f1d44653e2-kube-api-access-5hb97\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xwkhz\" (UID: \"56b67b2b-b9fd-4353-88e3-d4f1d44653e2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xwkhz" Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.921373 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-qqpfm"] Feb 02 13:20:15 crc kubenswrapper[4721]: I0202 13:20:15.937173 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-8zlv5"] Feb 02 13:20:15 crc kubenswrapper[4721]: W0202 13:20:15.996642 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda13f2341_6b53_4a7b_b67a_4a1d1846805d.slice/crio-81ebee5ebf47d1183088f5a241c61e8495e7bbfda07a21c496c5570593db4052 WatchSource:0}: Error finding container 81ebee5ebf47d1183088f5a241c61e8495e7bbfda07a21c496c5570593db4052: Status 404 returned error can't find the container with id 81ebee5ebf47d1183088f5a241c61e8495e7bbfda07a21c496c5570593db4052 Feb 02 13:20:15 crc kubenswrapper[4721]: W0202 13:20:15.999110 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c1486a5_ee95_4cde_9631_3c7c7aa31ae7.slice/crio-128c7313e2e60d8f33042b20b233aa59f4e6a19d54beffe21ff87896615e3018 WatchSource:0}: Error finding container 128c7313e2e60d8f33042b20b233aa59f4e6a19d54beffe21ff87896615e3018: Status 404 returned error can't find the container with id 128c7313e2e60d8f33042b20b233aa59f4e6a19d54beffe21ff87896615e3018 Feb 02 13:20:16 crc kubenswrapper[4721]: I0202 13:20:16.072679 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xwkhz" Feb 02 13:20:16 crc kubenswrapper[4721]: I0202 13:20:16.115480 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:16 crc kubenswrapper[4721]: I0202 13:20:16.115919 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:16 crc kubenswrapper[4721]: E0202 13:20:16.116290 4721 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 13:20:16 crc kubenswrapper[4721]: E0202 13:20:16.116356 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs podName:55bc1d80-1d29-4e15-baca-49eee6fd3aa5 nodeName:}" failed. No retries permitted until 2026-02-02 13:20:17.116336276 +0000 UTC m=+1157.418850675 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs") pod "openstack-operator-controller-manager-5d6c59fb84-5s25p" (UID: "55bc1d80-1d29-4e15-baca-49eee6fd3aa5") : secret "metrics-server-cert" not found Feb 02 13:20:16 crc kubenswrapper[4721]: E0202 13:20:16.116825 4721 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 13:20:16 crc kubenswrapper[4721]: E0202 13:20:16.116904 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs podName:55bc1d80-1d29-4e15-baca-49eee6fd3aa5 nodeName:}" failed. No retries permitted until 2026-02-02 13:20:17.116881812 +0000 UTC m=+1157.419396261 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs") pod "openstack-operator-controller-manager-5d6c59fb84-5s25p" (UID: "55bc1d80-1d29-4e15-baca-49eee6fd3aa5") : secret "webhook-server-cert" not found Feb 02 13:20:16 crc kubenswrapper[4721]: I0202 13:20:16.217445 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert\") pod \"infra-operator-controller-manager-79955696d6-hktcl\" (UID: \"9d11c3e4-10b4-4ff4-aaa2-04e342d984b4\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:20:16 crc kubenswrapper[4721]: E0202 13:20:16.217848 4721 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 13:20:16 crc kubenswrapper[4721]: E0202 13:20:16.217905 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert podName:9d11c3e4-10b4-4ff4-aaa2-04e342d984b4 nodeName:}" failed. No retries permitted until 2026-02-02 13:20:18.217887198 +0000 UTC m=+1158.520401597 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert") pod "infra-operator-controller-manager-79955696d6-hktcl" (UID: "9d11c3e4-10b4-4ff4-aaa2-04e342d984b4") : secret "infra-operator-webhook-server-cert" not found Feb 02 13:20:16 crc kubenswrapper[4721]: I0202 13:20:16.442100 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-s75st"] Feb 02 13:20:16 crc kubenswrapper[4721]: I0202 13:20:16.462635 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-qqpfm" event={"ID":"a13f2341-6b53-4a7b-b67a-4a1d1846805d","Type":"ContainerStarted","Data":"81ebee5ebf47d1183088f5a241c61e8495e7bbfda07a21c496c5570593db4052"} Feb 02 13:20:16 crc kubenswrapper[4721]: I0202 13:20:16.465311 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-q5lbf" event={"ID":"20f771bf-d003-48b0-8e50-0d1217f24b45","Type":"ContainerStarted","Data":"3c34b8a7ad1f0e42c2b1a6bcea58158d7345e04ba4caaf5bedeae2a64c01764e"} Feb 02 13:20:16 crc kubenswrapper[4721]: I0202 13:20:16.471981 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-8zlv5" event={"ID":"0c1486a5-ee95-4cde-9631-3c7c7aa31ae7","Type":"ContainerStarted","Data":"128c7313e2e60d8f33042b20b233aa59f4e6a19d54beffe21ff87896615e3018"} Feb 02 13:20:16 crc kubenswrapper[4721]: I0202 13:20:16.477308 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-s75st" event={"ID":"23be57b1-6b3e-4346-93f9-2c45b0562d2b","Type":"ContainerStarted","Data":"427142c2cd53668b37f223b38bb98c04a1e3ac386aa42354cf12f771c1bfff47"} Feb 02 13:20:16 crc kubenswrapper[4721]: I0202 13:20:16.528031 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw\" (UID: \"ae636942-3520-410e-b70a-b4fc19a527ca\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" Feb 02 13:20:16 crc kubenswrapper[4721]: E0202 13:20:16.528274 4721 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 13:20:16 crc kubenswrapper[4721]: E0202 13:20:16.528468 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert podName:ae636942-3520-410e-b70a-b4fc19a527ca nodeName:}" failed. No retries permitted until 2026-02-02 13:20:18.528315877 +0000 UTC m=+1158.830830266 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" (UID: "ae636942-3520-410e-b70a-b4fc19a527ca") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 13:20:17 crc kubenswrapper[4721]: I0202 13:20:17.145212 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:17 crc kubenswrapper[4721]: I0202 13:20:17.145754 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:17 crc kubenswrapper[4721]: E0202 13:20:17.145655 4721 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 13:20:17 crc kubenswrapper[4721]: E0202 13:20:17.146020 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs podName:55bc1d80-1d29-4e15-baca-49eee6fd3aa5 nodeName:}" failed. No retries permitted until 2026-02-02 13:20:19.146003022 +0000 UTC m=+1159.448517411 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs") pod "openstack-operator-controller-manager-5d6c59fb84-5s25p" (UID: "55bc1d80-1d29-4e15-baca-49eee6fd3aa5") : secret "webhook-server-cert" not found Feb 02 13:20:17 crc kubenswrapper[4721]: E0202 13:20:17.145966 4721 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 13:20:17 crc kubenswrapper[4721]: E0202 13:20:17.146396 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs podName:55bc1d80-1d29-4e15-baca-49eee6fd3aa5 nodeName:}" failed. No retries permitted until 2026-02-02 13:20:19.146367522 +0000 UTC m=+1159.448881901 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs") pod "openstack-operator-controller-manager-5d6c59fb84-5s25p" (UID: "55bc1d80-1d29-4e15-baca-49eee6fd3aa5") : secret "metrics-server-cert" not found Feb 02 13:20:17 crc kubenswrapper[4721]: I0202 13:20:17.523169 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-5x28t"] Feb 02 13:20:17 crc kubenswrapper[4721]: I0202 13:20:17.539681 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-42qq8"] Feb 02 13:20:17 crc kubenswrapper[4721]: I0202 13:20:17.554838 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-sjvjw"] Feb 02 13:20:17 crc kubenswrapper[4721]: W0202 13:20:17.555812 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39686eda_a258_408b_bf9c_7ff7d515ed9d.slice/crio-e8dde490ec5376d7612923a9e2bfd3282bf5e1664bd29b15db28d5ef64d20142 WatchSource:0}: Error finding container e8dde490ec5376d7612923a9e2bfd3282bf5e1664bd29b15db28d5ef64d20142: Status 404 returned error can't find the container with id e8dde490ec5376d7612923a9e2bfd3282bf5e1664bd29b15db28d5ef64d20142 Feb 02 13:20:17 crc kubenswrapper[4721]: I0202 13:20:17.580427 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-4pk6v"] Feb 02 13:20:17 crc kubenswrapper[4721]: I0202 13:20:17.629801 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5b9ffd7d65-rgkhb"] Feb 02 13:20:17 crc kubenswrapper[4721]: I0202 13:20:17.686445 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-rzjts"] Feb 02 13:20:17 crc kubenswrapper[4721]: I0202 13:20:17.714882 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-sq5w5"] Feb 02 13:20:17 crc kubenswrapper[4721]: I0202 13:20:17.743839 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-x6p4t"] Feb 02 13:20:17 crc kubenswrapper[4721]: I0202 13:20:17.758310 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-5vbh8"] Feb 02 13:20:17 crc kubenswrapper[4721]: I0202 13:20:17.776392 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-6z258"] Feb 02 13:20:17 crc kubenswrapper[4721]: I0202 13:20:17.788175 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-ct6hc"] Feb 02 13:20:17 crc kubenswrapper[4721]: I0202 13:20:17.801218 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-2828d"] Feb 02 13:20:17 crc kubenswrapper[4721]: W0202 13:20:17.946219 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c4864d3_2fdd_4b98_ac89_aefb49b56187.slice/crio-faa5a566441e6886f18327166fefb1254f417081b69cbf6d4b69c62f40c66281 WatchSource:0}: Error finding container 
faa5a566441e6886f18327166fefb1254f417081b69cbf6d4b69c62f40c66281: Status 404 returned error can't find the container with id faa5a566441e6886f18327166fefb1254f417081b69cbf6d4b69c62f40c66281 Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.055022 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm"] Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.075453 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv"] Feb 02 13:20:18 crc kubenswrapper[4721]: W0202 13:20:18.109197 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod499ca4ef_3867_407b_ab4a_64fff307e296.slice/crio-f70c073e7f283813d0de1a9ae64f36de9305b2d8727e696f1cdfe5b7d76e2f86 WatchSource:0}: Error finding container f70c073e7f283813d0de1a9ae64f36de9305b2d8727e696f1cdfe5b7d76e2f86: Status 404 returned error can't find the container with id f70c073e7f283813d0de1a9ae64f36de9305b2d8727e696f1cdfe5b7d76e2f86 Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.117446 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xwkhz"] Feb 02 13:20:18 crc kubenswrapper[4721]: E0202 13:20:18.133350 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6dqmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b964cf4cd-kqdjm_openstack-operators(ed67384c-22d3-4466-8990-744b122efbf4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 13:20:18 crc kubenswrapper[4721]: E0202 13:20:18.133376 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tg56x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68fc8c869-79zrv_openstack-operators(499ca4ef-3867-407b-ab4a-64fff307e296): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 13:20:18 crc 
kubenswrapper[4721]: E0202 13:20:18.135214 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv" podUID="499ca4ef-3867-407b-ab4a-64fff307e296" Feb 02 13:20:18 crc kubenswrapper[4721]: E0202 13:20:18.135299 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm" podUID="ed67384c-22d3-4466-8990-744b122efbf4" Feb 02 13:20:18 crc kubenswrapper[4721]: E0202 13:20:18.156330 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5hb97,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xwkhz_openstack-operators(56b67b2b-b9fd-4353-88e3-d4f1d44653e2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 13:20:18 crc kubenswrapper[4721]: E0202 13:20:18.158000 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xwkhz" podUID="56b67b2b-b9fd-4353-88e3-d4f1d44653e2" Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.289751 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert\") pod \"infra-operator-controller-manager-79955696d6-hktcl\" (UID: \"9d11c3e4-10b4-4ff4-aaa2-04e342d984b4\") " 
pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:20:18 crc kubenswrapper[4721]: E0202 13:20:18.289907 4721 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 13:20:18 crc kubenswrapper[4721]: E0202 13:20:18.289952 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert podName:9d11c3e4-10b4-4ff4-aaa2-04e342d984b4 nodeName:}" failed. No retries permitted until 2026-02-02 13:20:22.289938682 +0000 UTC m=+1162.592453071 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert") pod "infra-operator-controller-manager-79955696d6-hktcl" (UID: "9d11c3e4-10b4-4ff4-aaa2-04e342d984b4") : secret "infra-operator-webhook-server-cert" not found Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.552530 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-2828d" event={"ID":"60ff9309-fd37-4618-b4f0-38704a558ec0","Type":"ContainerStarted","Data":"993df692252a4ead9c18075ce5c8b9773f466e31a544109436477462dc5cee88"} Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.554468 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rzjts" event={"ID":"6b33adce-a49a-4ce2-af29-412661aaf062","Type":"ContainerStarted","Data":"74eaba4e0d9d2120b42db269dfff77ce3f4a8765d56c7b00681e8546918b6163"} Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.577449 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5x28t" event={"ID":"39686eda-a258-408b-bf9c-7ff7d515ed9d","Type":"ContainerStarted","Data":"e8dde490ec5376d7612923a9e2bfd3282bf5e1664bd29b15db28d5ef64d20142"} Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.580478 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-sq5w5" event={"ID":"e5a04e0d-8a73-4f21-a61d-374d7a5784fb","Type":"ContainerStarted","Data":"7c365d4bc9b09ec8665498b5848df8310ec99e5b9dd74ac183527e17c8233b3b"} Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.583997 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv" event={"ID":"499ca4ef-3867-407b-ab4a-64fff307e296","Type":"ContainerStarted","Data":"f70c073e7f283813d0de1a9ae64f36de9305b2d8727e696f1cdfe5b7d76e2f86"} Feb 02 13:20:18 crc kubenswrapper[4721]: E0202 13:20:18.587243 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv" podUID="499ca4ef-3867-407b-ab4a-64fff307e296" Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.588111 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xwkhz" event={"ID":"56b67b2b-b9fd-4353-88e3-d4f1d44653e2","Type":"ContainerStarted","Data":"5b53dd51eed45826864bf9b471fc9ecb9f967352eb8091998745e303c7f858a8"} Feb 02 13:20:18 crc kubenswrapper[4721]: E0202 13:20:18.590454 
4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xwkhz" podUID="56b67b2b-b9fd-4353-88e3-d4f1d44653e2" Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.591712 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-42qq8" event={"ID":"dc736681-960e-4f76-bc10-25f529da020a","Type":"ContainerStarted","Data":"24d34dd8e5d15be47a504e2775a2ba6965c6f08fdf2ca9c5884758b3a3d540c4"} Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.593289 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-4pk6v" event={"ID":"058f996d-8009-4f83-864d-177f7b577cf0","Type":"ContainerStarted","Data":"0983ba074312bb2dd88f71b118cf42ed8067e78c7f205037c928c167dbb665aa"} Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.597902 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-5vbh8" event={"ID":"b9eeabcd-14ef-4800-9f0c-1a3cd515d2aa","Type":"ContainerStarted","Data":"471947c38644f35ad2bc7655ff0343d21222d75dcd4f049db3926a1c3440b6fe"} Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.600286 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-x6p4t" event={"ID":"8a86dacc-de73-4b52-994c-3b089ee427cc","Type":"ContainerStarted","Data":"e439600c106ad9bbc89a92a91898ba0e11c3de77adca54f6e00afd1e4adc73f8"} Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.602760 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5b9ffd7d65-rgkhb" event={"ID":"79e5221b-04ee-496d-82b7-16af5b340595","Type":"ContainerStarted","Data":"7a1a23906b68b2275b2b93abc3243a639322b6f3cac84fea671ba99d73325348"} Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.604463 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw\" (UID: \"ae636942-3520-410e-b70a-b4fc19a527ca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" Feb 02 13:20:18 crc kubenswrapper[4721]: E0202 13:20:18.604645 4721 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 13:20:18 crc kubenswrapper[4721]: E0202 13:20:18.604708 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert podName:ae636942-3520-410e-b70a-b4fc19a527ca nodeName:}" failed. No retries permitted until 2026-02-02 13:20:22.604688638 +0000 UTC m=+1162.907203027 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" (UID: "ae636942-3520-410e-b70a-b4fc19a527ca") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.609362 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm" event={"ID":"ed67384c-22d3-4466-8990-744b122efbf4","Type":"ContainerStarted","Data":"6ea4d412214ebd4cf5a5edc453c2a6f8e11bf53bf63bc2293ab350ba29e245ab"} Feb 02 13:20:18 crc kubenswrapper[4721]: E0202 13:20:18.609885 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm" podUID="ed67384c-22d3-4466-8990-744b122efbf4" Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.611705 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ct6hc" event={"ID":"1c4864d3-2fdd-4b98-ac89-aefb49b56187","Type":"ContainerStarted","Data":"faa5a566441e6886f18327166fefb1254f417081b69cbf6d4b69c62f40c66281"} Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.613616 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-sjvjw" event={"ID":"2d60d537-ea47-42fa-94c3-61704aef0678","Type":"ContainerStarted","Data":"2b420fd5ec1c7a11931f50ccd0e0b5f7e9854fbffa2f1cd0f377125a53ff6b3d"} Feb 02 13:20:18 crc kubenswrapper[4721]: I0202 13:20:18.634721 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-6z258" event={"ID":"1f3087b4-acf0-4a27-9696-bdfb4728e96c","Type":"ContainerStarted","Data":"17f2e2e8004c70072aacb28682f3183d6d7f698c0b3aabebabe8de85b22a96f3"} Feb 02 13:20:19 crc kubenswrapper[4721]: I0202 13:20:19.219095 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:19 crc kubenswrapper[4721]: I0202 13:20:19.219214 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:19 crc kubenswrapper[4721]: E0202 13:20:19.219252 4721 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 13:20:19 crc kubenswrapper[4721]: E0202 13:20:19.219317 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs podName:55bc1d80-1d29-4e15-baca-49eee6fd3aa5 nodeName:}" failed. 
No retries permitted until 2026-02-02 13:20:23.219298809 +0000 UTC m=+1163.521813198 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs") pod "openstack-operator-controller-manager-5d6c59fb84-5s25p" (UID: "55bc1d80-1d29-4e15-baca-49eee6fd3aa5") : secret "webhook-server-cert" not found Feb 02 13:20:19 crc kubenswrapper[4721]: E0202 13:20:19.219378 4721 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 13:20:19 crc kubenswrapper[4721]: E0202 13:20:19.219418 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs podName:55bc1d80-1d29-4e15-baca-49eee6fd3aa5 nodeName:}" failed. No retries permitted until 2026-02-02 13:20:23.219407172 +0000 UTC m=+1163.521921561 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs") pod "openstack-operator-controller-manager-5d6c59fb84-5s25p" (UID: "55bc1d80-1d29-4e15-baca-49eee6fd3aa5") : secret "metrics-server-cert" not found Feb 02 13:20:19 crc kubenswrapper[4721]: E0202 13:20:19.657877 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv" podUID="499ca4ef-3867-407b-ab4a-64fff307e296" Feb 02 13:20:19 crc kubenswrapper[4721]: E0202 13:20:19.657886 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xwkhz" podUID="56b67b2b-b9fd-4353-88e3-d4f1d44653e2" Feb 02 13:20:19 crc kubenswrapper[4721]: E0202 13:20:19.677930 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm" podUID="ed67384c-22d3-4466-8990-744b122efbf4" Feb 02 13:20:22 crc kubenswrapper[4721]: I0202 13:20:22.301254 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert\") pod \"infra-operator-controller-manager-79955696d6-hktcl\" (UID: \"9d11c3e4-10b4-4ff4-aaa2-04e342d984b4\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:20:22 crc kubenswrapper[4721]: E0202 13:20:22.301406 4721 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 13:20:22 crc kubenswrapper[4721]: E0202 13:20:22.301975 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert 
podName:9d11c3e4-10b4-4ff4-aaa2-04e342d984b4 nodeName:}" failed. No retries permitted until 2026-02-02 13:20:30.301952625 +0000 UTC m=+1170.604467014 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert") pod "infra-operator-controller-manager-79955696d6-hktcl" (UID: "9d11c3e4-10b4-4ff4-aaa2-04e342d984b4") : secret "infra-operator-webhook-server-cert" not found Feb 02 13:20:22 crc kubenswrapper[4721]: I0202 13:20:22.605901 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw\" (UID: \"ae636942-3520-410e-b70a-b4fc19a527ca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" Feb 02 13:20:22 crc kubenswrapper[4721]: E0202 13:20:22.606338 4721 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 13:20:22 crc kubenswrapper[4721]: E0202 13:20:22.606646 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert podName:ae636942-3520-410e-b70a-b4fc19a527ca nodeName:}" failed. No retries permitted until 2026-02-02 13:20:30.606623819 +0000 UTC m=+1170.909138208 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" (UID: "ae636942-3520-410e-b70a-b4fc19a527ca") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 13:20:23 crc kubenswrapper[4721]: I0202 13:20:23.317224 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:23 crc kubenswrapper[4721]: I0202 13:20:23.317286 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:23 crc kubenswrapper[4721]: E0202 13:20:23.317359 4721 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 13:20:23 crc kubenswrapper[4721]: E0202 13:20:23.317393 4721 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 13:20:23 crc kubenswrapper[4721]: E0202 13:20:23.317428 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs podName:55bc1d80-1d29-4e15-baca-49eee6fd3aa5 nodeName:}" failed. No retries permitted until 2026-02-02 13:20:31.317414466 +0000 UTC m=+1171.619928855 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs") pod "openstack-operator-controller-manager-5d6c59fb84-5s25p" (UID: "55bc1d80-1d29-4e15-baca-49eee6fd3aa5") : secret "metrics-server-cert" not found Feb 02 13:20:23 crc kubenswrapper[4721]: E0202 13:20:23.317440 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs podName:55bc1d80-1d29-4e15-baca-49eee6fd3aa5 nodeName:}" failed. No retries permitted until 2026-02-02 13:20:31.317435157 +0000 UTC m=+1171.619949546 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs") pod "openstack-operator-controller-manager-5d6c59fb84-5s25p" (UID: "55bc1d80-1d29-4e15-baca-49eee6fd3aa5") : secret "webhook-server-cert" not found Feb 02 13:20:30 crc kubenswrapper[4721]: I0202 13:20:30.376917 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert\") pod \"infra-operator-controller-manager-79955696d6-hktcl\" (UID: \"9d11c3e4-10b4-4ff4-aaa2-04e342d984b4\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:20:30 crc kubenswrapper[4721]: E0202 13:20:30.377608 4721 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 13:20:30 crc kubenswrapper[4721]: E0202 13:20:30.377658 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert podName:9d11c3e4-10b4-4ff4-aaa2-04e342d984b4 nodeName:}" failed. No retries permitted until 2026-02-02 13:20:46.377643824 +0000 UTC m=+1186.680158213 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert") pod "infra-operator-controller-manager-79955696d6-hktcl" (UID: "9d11c3e4-10b4-4ff4-aaa2-04e342d984b4") : secret "infra-operator-webhook-server-cert" not found Feb 02 13:20:30 crc kubenswrapper[4721]: I0202 13:20:30.682524 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw\" (UID: \"ae636942-3520-410e-b70a-b4fc19a527ca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" Feb 02 13:20:30 crc kubenswrapper[4721]: E0202 13:20:30.682718 4721 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 13:20:30 crc kubenswrapper[4721]: E0202 13:20:30.683013 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert podName:ae636942-3520-410e-b70a-b4fc19a527ca nodeName:}" failed. No retries permitted until 2026-02-02 13:20:46.682968697 +0000 UTC m=+1186.985483076 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" (UID: "ae636942-3520-410e-b70a-b4fc19a527ca") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 13:20:30 crc kubenswrapper[4721]: E0202 13:20:30.887419 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:bead175f27e5f074f723694f3b66e5aa7238411bf8a27a267b9a2936e4465521" Feb 02 13:20:30 crc kubenswrapper[4721]: E0202 13:20:30.887677 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:bead175f27e5f074f723694f3b66e5aa7238411bf8a27a267b9a2936e4465521,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zsmbn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-5f4b8bd54d-qqpfm_openstack-operators(a13f2341-6b53-4a7b-b67a-4a1d1846805d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:20:30 crc kubenswrapper[4721]: E0202 13:20:30.888893 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-qqpfm" podUID="a13f2341-6b53-4a7b-b67a-4a1d1846805d" Feb 02 13:20:31 crc kubenswrapper[4721]: I0202 13:20:31.395596 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:31 crc kubenswrapper[4721]: I0202 13:20:31.395697 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:31 crc kubenswrapper[4721]: E0202 13:20:31.396881 4721 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 13:20:31 crc kubenswrapper[4721]: E0202 13:20:31.396958 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs podName:55bc1d80-1d29-4e15-baca-49eee6fd3aa5 nodeName:}" failed. No retries permitted until 2026-02-02 13:20:47.396936 +0000 UTC m=+1187.699450469 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs") pod "openstack-operator-controller-manager-5d6c59fb84-5s25p" (UID: "55bc1d80-1d29-4e15-baca-49eee6fd3aa5") : secret "webhook-server-cert" not found Feb 02 13:20:31 crc kubenswrapper[4721]: I0202 13:20:31.402498 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-metrics-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:31 crc kubenswrapper[4721]: E0202 13:20:31.762708 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:bead175f27e5f074f723694f3b66e5aa7238411bf8a27a267b9a2936e4465521\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-qqpfm" podUID="a13f2341-6b53-4a7b-b67a-4a1d1846805d" Feb 02 13:20:31 crc kubenswrapper[4721]: E0202 13:20:31.813172 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:1f593e8d49d02b6484c89632192ae54771675c54fbd8426e3675b8e20ecfd7c4" Feb 02 13:20:31 crc kubenswrapper[4721]: E0202 13:20:31.813377 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:1f593e8d49d02b6484c89632192ae54771675c54fbd8426e3675b8e20ecfd7c4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p959t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-8886f4c47-q5lbf_openstack-operators(20f771bf-d003-48b0-8e50-0d1217f24b45): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:20:31 crc kubenswrapper[4721]: E0202 13:20:31.814517 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-q5lbf" podUID="20f771bf-d003-48b0-8e50-0d1217f24b45" Feb 02 13:20:32 crc kubenswrapper[4721]: E0202 13:20:32.769970 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:1f593e8d49d02b6484c89632192ae54771675c54fbd8426e3675b8e20ecfd7c4\\\"\"" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-q5lbf" podUID="20f771bf-d003-48b0-8e50-0d1217f24b45" Feb 02 13:20:33 crc kubenswrapper[4721]: E0202 13:20:33.742040 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6" Feb 02 13:20:33 crc kubenswrapper[4721]: E0202 13:20:33.742319 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-662gz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-585dbc889-sjvjw_openstack-operators(2d60d537-ea47-42fa-94c3-61704aef0678): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:20:33 crc kubenswrapper[4721]: E0202 13:20:33.743698 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-sjvjw" podUID="2d60d537-ea47-42fa-94c3-61704aef0678" Feb 02 13:20:33 crc kubenswrapper[4721]: E0202 13:20:33.776205 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-sjvjw" podUID="2d60d537-ea47-42fa-94c3-61704aef0678" Feb 02 13:20:34 crc kubenswrapper[4721]: E0202 13:20:34.439298 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/barbican-operator@sha256:379470e2752f286e73908e94233e884922b231169a5521a59f53843a2dc3184c" Feb 02 13:20:34 crc kubenswrapper[4721]: E0202 13:20:34.439771 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:379470e2752f286e73908e94233e884922b231169a5521a59f53843a2dc3184c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jnfsj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7b6c4d8c5f-8zlv5_openstack-operators(0c1486a5-ee95-4cde-9631-3c7c7aa31ae7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:20:34 crc kubenswrapper[4721]: E0202 13:20:34.440982 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-8zlv5" podUID="0c1486a5-ee95-4cde-9631-3c7c7aa31ae7" Feb 02 13:20:34 crc kubenswrapper[4721]: E0202 13:20:34.784863 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:379470e2752f286e73908e94233e884922b231169a5521a59f53843a2dc3184c\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-8zlv5" 
podUID="0c1486a5-ee95-4cde-9631-3c7c7aa31ae7" Feb 02 13:20:36 crc kubenswrapper[4721]: E0202 13:20:36.597413 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b" Feb 02 13:20:36 crc kubenswrapper[4721]: E0202 13:20:36.597852 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hrbbc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-564965969-4pk6v_openstack-operators(058f996d-8009-4f83-864d-177f7b577cf0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:20:36 crc kubenswrapper[4721]: E0202 13:20:36.599042 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-4pk6v" podUID="058f996d-8009-4f83-864d-177f7b577cf0" Feb 02 13:20:36 crc kubenswrapper[4721]: E0202 13:20:36.801961 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-4pk6v" podUID="058f996d-8009-4f83-864d-177f7b577cf0" Feb 02 13:20:37 crc kubenswrapper[4721]: E0202 13:20:37.306728 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566" Feb 02 13:20:37 crc kubenswrapper[4721]: E0202 13:20:37.306915 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rqstl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7dd968899f-5x28t_openstack-operators(39686eda-a258-408b-bf9c-7ff7d515ed9d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:20:37 crc kubenswrapper[4721]: E0202 13:20:37.309531 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5x28t" 
podUID="39686eda-a258-408b-bf9c-7ff7d515ed9d" Feb 02 13:20:37 crc kubenswrapper[4721]: E0202 13:20:37.810057 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5x28t" podUID="39686eda-a258-408b-bf9c-7ff7d515ed9d" Feb 02 13:20:38 crc kubenswrapper[4721]: E0202 13:20:38.653114 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf" Feb 02 13:20:38 crc kubenswrapper[4721]: E0202 13:20:38.653601 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-snq4r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67bf948998-ct6hc_openstack-operators(1c4864d3-2fdd-4b98-ac89-aefb49b56187): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:20:38 crc kubenswrapper[4721]: E0202 13:20:38.655303 4721 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ct6hc" podUID="1c4864d3-2fdd-4b98-ac89-aefb49b56187" Feb 02 13:20:38 crc kubenswrapper[4721]: E0202 13:20:38.817167 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ct6hc" podUID="1c4864d3-2fdd-4b98-ac89-aefb49b56187" Feb 02 13:20:39 crc kubenswrapper[4721]: E0202 13:20:39.264890 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241" Feb 02 13:20:39 crc kubenswrapper[4721]: E0202 13:20:39.265054 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w294l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56f8bfcd9f-2828d_openstack-operators(60ff9309-fd37-4618-b4f0-38704a558ec0): ErrImagePull: 
rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:20:39 crc kubenswrapper[4721]: E0202 13:20:39.266469 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-2828d" podUID="60ff9309-fd37-4618-b4f0-38704a558ec0" Feb 02 13:20:39 crc kubenswrapper[4721]: E0202 13:20:39.825542 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-2828d" podUID="60ff9309-fd37-4618-b4f0-38704a558ec0" Feb 02 13:20:46 crc kubenswrapper[4721]: E0202 13:20:46.137906 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17" Feb 02 13:20:46 crc kubenswrapper[4721]: E0202 13:20:46.138916 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d9qz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-84f48565d4-5vbh8_openstack-operators(b9eeabcd-14ef-4800-9f0c-1a3cd515d2aa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:20:46 crc kubenswrapper[4721]: E0202 13:20:46.140368 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-5vbh8" podUID="b9eeabcd-14ef-4800-9f0c-1a3cd515d2aa" Feb 02 13:20:46 crc kubenswrapper[4721]: I0202 13:20:46.457635 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert\") pod \"infra-operator-controller-manager-79955696d6-hktcl\" (UID: \"9d11c3e4-10b4-4ff4-aaa2-04e342d984b4\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:20:46 crc kubenswrapper[4721]: I0202 13:20:46.465462 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d11c3e4-10b4-4ff4-aaa2-04e342d984b4-cert\") pod \"infra-operator-controller-manager-79955696d6-hktcl\" (UID: \"9d11c3e4-10b4-4ff4-aaa2-04e342d984b4\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:20:46 crc kubenswrapper[4721]: I0202 13:20:46.706931 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:20:46 crc kubenswrapper[4721]: E0202 13:20:46.745214 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e" Feb 02 13:20:46 crc kubenswrapper[4721]: E0202 13:20:46.745385 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8fnk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-55bff696bd-6z258_openstack-operators(1f3087b4-acf0-4a27-9696-bdfb4728e96c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:20:46 crc kubenswrapper[4721]: E0202 13:20:46.746677 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-6z258" podUID="1f3087b4-acf0-4a27-9696-bdfb4728e96c" Feb 02 13:20:46 crc kubenswrapper[4721]: I0202 13:20:46.764204 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw\" (UID: \"ae636942-3520-410e-b70a-b4fc19a527ca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" Feb 02 13:20:46 crc kubenswrapper[4721]: I0202 13:20:46.769772 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ae636942-3520-410e-b70a-b4fc19a527ca-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw\" (UID: \"ae636942-3520-410e-b70a-b4fc19a527ca\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" Feb 02 13:20:46 crc kubenswrapper[4721]: I0202 13:20:46.875653 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" Feb 02 13:20:46 crc kubenswrapper[4721]: E0202 13:20:46.901513 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e\\\"\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-6z258" podUID="1f3087b4-acf0-4a27-9696-bdfb4728e96c" Feb 02 13:20:46 crc kubenswrapper[4721]: E0202 13:20:46.901796 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-5vbh8" podUID="b9eeabcd-14ef-4800-9f0c-1a3cd515d2aa" Feb 02 13:20:47 crc kubenswrapper[4721]: I0202 13:20:47.476888 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:47 crc kubenswrapper[4721]: I0202 13:20:47.484505 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55bc1d80-1d29-4e15-baca-49eee6fd3aa5-webhook-certs\") pod \"openstack-operator-controller-manager-5d6c59fb84-5s25p\" (UID: \"55bc1d80-1d29-4e15-baca-49eee6fd3aa5\") " pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:47 crc kubenswrapper[4721]: I0202 13:20:47.557769 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:47 crc kubenswrapper[4721]: I0202 13:20:47.865889 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-hktcl"] Feb 02 13:20:47 crc kubenswrapper[4721]: I0202 13:20:47.920055 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-sq5w5" event={"ID":"e5a04e0d-8a73-4f21-a61d-374d7a5784fb","Type":"ContainerStarted","Data":"c0d07b97a688293d76e08988273b5741d49efe11e45772f740883091f139ec0d"} Feb 02 13:20:47 crc kubenswrapper[4721]: I0202 13:20:47.920203 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-sq5w5" Feb 02 13:20:47 crc kubenswrapper[4721]: W0202 13:20:47.933413 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d11c3e4_10b4_4ff4_aaa2_04e342d984b4.slice/crio-cf534ec8ff2c801fb79015aaf251334b66d35d2f6b81f8e3be6a7a9dcf1e5e1f WatchSource:0}: Error finding container cf534ec8ff2c801fb79015aaf251334b66d35d2f6b81f8e3be6a7a9dcf1e5e1f: Status 404 returned error can't find the container with id cf534ec8ff2c801fb79015aaf251334b66d35d2f6b81f8e3be6a7a9dcf1e5e1f Feb 02 13:20:47 crc kubenswrapper[4721]: I0202 13:20:47.933808 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p"] Feb 02 13:20:47 crc kubenswrapper[4721]: I0202 13:20:47.934294 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-s75st" event={"ID":"23be57b1-6b3e-4346-93f9-2c45b0562d2b","Type":"ContainerStarted","Data":"4fba65b61bf8ae32cd08011529ace66fa359e9d02d20f9024a34708b8a36fba7"} Feb 02 13:20:47 crc kubenswrapper[4721]: I0202 13:20:47.934442 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-s75st" Feb 02 13:20:47 crc kubenswrapper[4721]: I0202 13:20:47.950173 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-sq5w5" podStartSLOduration=5.776487449 podStartE2EDuration="33.950155125s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:17.946291525 +0000 UTC m=+1158.248805914" lastFinishedPulling="2026-02-02 13:20:46.119959201 +0000 UTC m=+1186.422473590" observedRunningTime="2026-02-02 13:20:47.940962748 +0000 UTC m=+1188.243477197" watchObservedRunningTime="2026-02-02 13:20:47.950155125 +0000 UTC m=+1188.252669514" Feb 02 13:20:47 crc kubenswrapper[4721]: I0202 13:20:47.972553 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-s75st" podStartSLOduration=4.28802897 podStartE2EDuration="33.97253573s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:16.433014215 +0000 UTC m=+1156.735528604" lastFinishedPulling="2026-02-02 13:20:46.117520975 +0000 UTC m=+1186.420035364" observedRunningTime="2026-02-02 13:20:47.96922031 +0000 UTC m=+1188.271734699" watchObservedRunningTime="2026-02-02 13:20:47.97253573 +0000 UTC m=+1188.275050129" Feb 02 13:20:48 crc kubenswrapper[4721]: I0202 13:20:48.003023 4721 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rzjts" event={"ID":"6b33adce-a49a-4ce2-af29-412661aaf062","Type":"ContainerStarted","Data":"cb526328f69003fa1613ab54133e2b47c03d6a116bd0f2d0d9f76c110b41e91d"} Feb 02 13:20:48 crc kubenswrapper[4721]: I0202 13:20:48.003110 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rzjts" Feb 02 13:20:48 crc kubenswrapper[4721]: I0202 13:20:48.007534 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw"] Feb 02 13:20:48 crc kubenswrapper[4721]: I0202 13:20:48.017749 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5b9ffd7d65-rgkhb" event={"ID":"79e5221b-04ee-496d-82b7-16af5b340595","Type":"ContainerStarted","Data":"4eb53c7c6627b8e1c53a26b3100872c36008080df6a06829dd6bae642daa0b7c"} Feb 02 13:20:48 crc kubenswrapper[4721]: I0202 13:20:48.018675 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5b9ffd7d65-rgkhb" Feb 02 13:20:48 crc kubenswrapper[4721]: I0202 13:20:48.023341 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rzjts" podStartSLOduration=5.497778497 podStartE2EDuration="34.023325091s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:17.592983658 +0000 UTC m=+1157.895498047" lastFinishedPulling="2026-02-02 13:20:46.118530252 +0000 UTC m=+1186.421044641" observedRunningTime="2026-02-02 13:20:48.02179544 +0000 UTC m=+1188.324309829" watchObservedRunningTime="2026-02-02 13:20:48.023325091 +0000 UTC m=+1188.325839470" Feb 02 13:20:48 crc kubenswrapper[4721]: I0202 13:20:48.070882 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5b9ffd7d65-rgkhb" podStartSLOduration=4.903179337 podStartE2EDuration="34.070863715s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:17.557198552 +0000 UTC m=+1157.859712941" lastFinishedPulling="2026-02-02 13:20:46.72488293 +0000 UTC m=+1187.027397319" observedRunningTime="2026-02-02 13:20:48.046001383 +0000 UTC m=+1188.348515782" watchObservedRunningTime="2026-02-02 13:20:48.070863715 +0000 UTC m=+1188.373378104" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.026534 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv" event={"ID":"499ca4ef-3867-407b-ab4a-64fff307e296","Type":"ContainerStarted","Data":"f49643613b63bc62f45dc4e728757d15c987d21c43c18bfa6758c390fe1f39ce"} Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.027357 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.028123 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" event={"ID":"55bc1d80-1d29-4e15-baca-49eee6fd3aa5","Type":"ContainerStarted","Data":"c09e042f26a6c7cec14d57fb8281c8288c7994c6467860aedc668c96157eb6ff"} Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.028166 4721 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" event={"ID":"55bc1d80-1d29-4e15-baca-49eee6fd3aa5","Type":"ContainerStarted","Data":"0be584c4dfc86e974e8c29adc0d7b52a3cb7525cb15ff3f593b07acafee1f276"} Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.028499 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.029231 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" event={"ID":"9d11c3e4-10b4-4ff4-aaa2-04e342d984b4","Type":"ContainerStarted","Data":"cf534ec8ff2c801fb79015aaf251334b66d35d2f6b81f8e3be6a7a9dcf1e5e1f"} Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.030696 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-sjvjw" event={"ID":"2d60d537-ea47-42fa-94c3-61704aef0678","Type":"ContainerStarted","Data":"ea603c5fa5301bac67a200d83e1ea6c4bc07678d5b2cb44a5830807116b184e6"} Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.030921 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-sjvjw" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.032852 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-x6p4t" event={"ID":"8a86dacc-de73-4b52-994c-3b089ee427cc","Type":"ContainerStarted","Data":"647c2bb6d09289bca08a4175e39e81c1aee864e548128f2e678bd28e7d8c85b5"} Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.032878 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-x6p4t" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.034708 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-729mv" event={"ID":"0562e590-1a66-4fbc-862d-833bc1600eac","Type":"ContainerStarted","Data":"387137b53d96a73fa558ca940bfa48d59d3708420598717f17167e4a673e7763"} Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.034841 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-729mv" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.036604 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xwkhz" event={"ID":"56b67b2b-b9fd-4353-88e3-d4f1d44653e2","Type":"ContainerStarted","Data":"f9d62505f47aca83675b83ff091313c0aa58d9af4c14cc8c0f02b74cbf4f7e25"} Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.038265 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" event={"ID":"ae636942-3520-410e-b70a-b4fc19a527ca","Type":"ContainerStarted","Data":"e16577c4382a9568fff440f6e1920728aa5eece0832bdecd4ce93e31660a2801"} Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.039875 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-q5lbf" 
event={"ID":"20f771bf-d003-48b0-8e50-0d1217f24b45","Type":"ContainerStarted","Data":"2173a92b978455abce7aba1b85d479f3912404197a459a11a84a16ae19eb40ba"} Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.040102 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-q5lbf" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.041326 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-qqpfm" event={"ID":"a13f2341-6b53-4a7b-b67a-4a1d1846805d","Type":"ContainerStarted","Data":"873cb0f86ebab76c9c5fba2b5b0d2c4ba945e1bd27fca2383c85d94aa4f60695"} Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.041477 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-qqpfm" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.043549 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-42qq8" event={"ID":"dc736681-960e-4f76-bc10-25f529da020a","Type":"ContainerStarted","Data":"5763d53b309ea1dd4d4e42205c5751ae55546b72a7bdfd71d412ae9c719a3eda"} Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.043732 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-42qq8" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.044864 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-4pk6v" event={"ID":"058f996d-8009-4f83-864d-177f7b577cf0","Type":"ContainerStarted","Data":"937b1fdbc6508ad95d249a0e96910617ee8f21c89375457abc058c69d9b5af32"} Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.045025 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-564965969-4pk6v" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.047657 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm" event={"ID":"ed67384c-22d3-4466-8990-744b122efbf4","Type":"ContainerStarted","Data":"f5a0aab20ab7f56fabfe6dd5a21a586053f50f3b8c5ed42c3fbe3bf38a7917b5"} Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.091209 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv" podStartSLOduration=5.8596063560000005 podStartE2EDuration="35.091193298s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:18.133191841 +0000 UTC m=+1158.435706230" lastFinishedPulling="2026-02-02 13:20:47.364778763 +0000 UTC m=+1187.667293172" observedRunningTime="2026-02-02 13:20:49.087384976 +0000 UTC m=+1189.389899365" watchObservedRunningTime="2026-02-02 13:20:49.091193298 +0000 UTC m=+1189.393707687" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.134803 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-x6p4t" podStartSLOduration=6.794040959 podStartE2EDuration="35.134781254s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:17.778118586 +0000 UTC m=+1158.080632975" lastFinishedPulling="2026-02-02 13:20:46.118858881 +0000 UTC m=+1186.421373270" 
observedRunningTime="2026-02-02 13:20:49.117489918 +0000 UTC m=+1189.420004317" watchObservedRunningTime="2026-02-02 13:20:49.134781254 +0000 UTC m=+1189.437295643" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.194852 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-sjvjw" podStartSLOduration=4.815111438 podStartE2EDuration="35.194830115s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:17.562463084 +0000 UTC m=+1157.864977473" lastFinishedPulling="2026-02-02 13:20:47.942181771 +0000 UTC m=+1188.244696150" observedRunningTime="2026-02-02 13:20:49.161484545 +0000 UTC m=+1189.463998964" watchObservedRunningTime="2026-02-02 13:20:49.194830115 +0000 UTC m=+1189.497344514" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.195136 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-qqpfm" podStartSLOduration=3.808361114 podStartE2EDuration="35.195130544s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:16.02864277 +0000 UTC m=+1156.331157159" lastFinishedPulling="2026-02-02 13:20:47.41541218 +0000 UTC m=+1187.717926589" observedRunningTime="2026-02-02 13:20:49.194419435 +0000 UTC m=+1189.496933844" watchObservedRunningTime="2026-02-02 13:20:49.195130544 +0000 UTC m=+1189.497644933" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.269154 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm" podStartSLOduration=6.038181175 podStartE2EDuration="35.269133511s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:18.13316392 +0000 UTC m=+1158.435678309" lastFinishedPulling="2026-02-02 13:20:47.364116256 +0000 UTC m=+1187.666630645" observedRunningTime="2026-02-02 13:20:49.23944458 +0000 UTC m=+1189.541958969" watchObservedRunningTime="2026-02-02 13:20:49.269133511 +0000 UTC m=+1189.571647920" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.279681 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-42qq8" podStartSLOduration=6.733942886 podStartE2EDuration="35.279664986s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:17.572181156 +0000 UTC m=+1157.874695545" lastFinishedPulling="2026-02-02 13:20:46.117903256 +0000 UTC m=+1186.420417645" observedRunningTime="2026-02-02 13:20:49.270396135 +0000 UTC m=+1189.572910544" watchObservedRunningTime="2026-02-02 13:20:49.279664986 +0000 UTC m=+1189.582179395" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.346269 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-564965969-4pk6v" podStartSLOduration=4.976911317 podStartE2EDuration="35.346248323s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:17.571797266 +0000 UTC m=+1157.874311645" lastFinishedPulling="2026-02-02 13:20:47.941134262 +0000 UTC m=+1188.243648651" observedRunningTime="2026-02-02 13:20:49.315858813 +0000 UTC m=+1189.618373212" watchObservedRunningTime="2026-02-02 13:20:49.346248323 +0000 UTC m=+1189.648762732" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.351632 4721 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xwkhz" podStartSLOduration=6.106222612 podStartE2EDuration="35.351617048s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:18.156184851 +0000 UTC m=+1158.458699240" lastFinishedPulling="2026-02-02 13:20:47.401579287 +0000 UTC m=+1187.704093676" observedRunningTime="2026-02-02 13:20:49.347438345 +0000 UTC m=+1189.649952744" watchObservedRunningTime="2026-02-02 13:20:49.351617048 +0000 UTC m=+1189.654131427" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.395003 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-729mv" podStartSLOduration=4.539814106 podStartE2EDuration="35.394978468s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:15.262454646 +0000 UTC m=+1155.564969035" lastFinishedPulling="2026-02-02 13:20:46.117619008 +0000 UTC m=+1186.420133397" observedRunningTime="2026-02-02 13:20:49.376598102 +0000 UTC m=+1189.679112491" watchObservedRunningTime="2026-02-02 13:20:49.394978468 +0000 UTC m=+1189.697492867" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.482774 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" podStartSLOduration=35.482748548 podStartE2EDuration="35.482748548s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:20:49.482570843 +0000 UTC m=+1189.785085232" watchObservedRunningTime="2026-02-02 13:20:49.482748548 +0000 UTC m=+1189.785262937" Feb 02 13:20:49 crc kubenswrapper[4721]: I0202 13:20:49.483016 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-q5lbf" podStartSLOduration=3.654937091 podStartE2EDuration="35.483007285s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:15.573519784 +0000 UTC m=+1155.876034183" lastFinishedPulling="2026-02-02 13:20:47.401589968 +0000 UTC m=+1187.704104377" observedRunningTime="2026-02-02 13:20:49.427521027 +0000 UTC m=+1189.730035416" watchObservedRunningTime="2026-02-02 13:20:49.483007285 +0000 UTC m=+1189.785521674" Feb 02 13:20:50 crc kubenswrapper[4721]: I0202 13:20:50.060069 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-8zlv5" event={"ID":"0c1486a5-ee95-4cde-9631-3c7c7aa31ae7","Type":"ContainerStarted","Data":"e91d52bb1f7e6c61fa343414482c30f0348d9fe59bf0f15b9460d046775f0008"} Feb 02 13:20:50 crc kubenswrapper[4721]: I0202 13:20:50.113179 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-8zlv5" podStartSLOduration=2.689799218 podStartE2EDuration="36.113162906s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:16.043905922 +0000 UTC m=+1156.346420311" lastFinishedPulling="2026-02-02 13:20:49.46726962 +0000 UTC m=+1189.769783999" observedRunningTime="2026-02-02 13:20:50.107321148 +0000 UTC m=+1190.409835537" watchObservedRunningTime="2026-02-02 13:20:50.113162906 +0000 UTC m=+1190.415677295" Feb 02 13:20:52 crc 
kubenswrapper[4721]: I0202 13:20:52.089856 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ct6hc" event={"ID":"1c4864d3-2fdd-4b98-ac89-aefb49b56187","Type":"ContainerStarted","Data":"f367019a0258f466a9f8b33ecbf64c6e7081cb25195b2a05c4d0f997ea19028a"} Feb 02 13:20:52 crc kubenswrapper[4721]: I0202 13:20:52.090422 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ct6hc" Feb 02 13:20:52 crc kubenswrapper[4721]: I0202 13:20:52.093170 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5x28t" event={"ID":"39686eda-a258-408b-bf9c-7ff7d515ed9d","Type":"ContainerStarted","Data":"860752c75bf200f066ae6a9f1173c8d0bcb829e80158db2a8541dec6fec31650"} Feb 02 13:20:52 crc kubenswrapper[4721]: I0202 13:20:52.093377 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5x28t" Feb 02 13:20:52 crc kubenswrapper[4721]: I0202 13:20:52.111378 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ct6hc" podStartSLOduration=5.217866862 podStartE2EDuration="38.111361366s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:17.987341544 +0000 UTC m=+1158.289855933" lastFinishedPulling="2026-02-02 13:20:50.880836048 +0000 UTC m=+1191.183350437" observedRunningTime="2026-02-02 13:20:52.105230651 +0000 UTC m=+1192.407745050" watchObservedRunningTime="2026-02-02 13:20:52.111361366 +0000 UTC m=+1192.413875755" Feb 02 13:20:52 crc kubenswrapper[4721]: I0202 13:20:52.121309 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5x28t" podStartSLOduration=4.743463995 podStartE2EDuration="38.121289834s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:17.571487508 +0000 UTC m=+1157.874001897" lastFinishedPulling="2026-02-02 13:20:50.949313347 +0000 UTC m=+1191.251827736" observedRunningTime="2026-02-02 13:20:52.121059238 +0000 UTC m=+1192.423573627" watchObservedRunningTime="2026-02-02 13:20:52.121289834 +0000 UTC m=+1192.423804223" Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.111435 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-2828d" event={"ID":"60ff9309-fd37-4618-b4f0-38704a558ec0","Type":"ContainerStarted","Data":"2607181fbcc276cd8a9313ceac32ed74bac35ecf96c2f108e0d4a96ab4d6cd3f"} Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.112002 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-2828d" Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.114193 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" event={"ID":"ae636942-3520-410e-b70a-b4fc19a527ca","Type":"ContainerStarted","Data":"0661209edaa19772a1b3da5c382b5f3c6ba963856a2c619fc467140d78dad240"} Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.114269 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" 
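[Annotation] The ErrImagePull → ImagePullBackOff sequences earlier in this log (manila-, mariadb-, test-, keystone-, and nova-operator pods) are kubelet's per-image pull back-off gating retries. Below is a minimal sketch of that schedule, assuming the 10s initial / 5m maximum delays kubelet has historically passed to client-go's flowcontrol.Backoff; the back-off key is illustrative (kubelet keys entries per pod and container image), not kubelet's actual key format.

```go
// Sketch of the back-off behind the "Back-off pulling image" entries above.
// Assumptions: 10s initial delay and 5m cap (kubelet's historical defaults),
// and the same k8s.io/client-go flowcontrol.Backoff helper kubelet wires up
// for image pulls. The key string is hypothetical, for illustration only.
package main

import (
	"fmt"
	"time"

	"k8s.io/client-go/util/flowcontrol"
)

func main() {
	backOff := flowcontrol.NewBackOff(10*time.Second, 5*time.Minute)
	key := "manila-operator-image" // hypothetical back-off key

	for failure := 1; failure <= 7; failure++ {
		// Each failed pull (ErrImagePull) advances the entry; with no jitter
		// the delay doubles until the cap: 10s, 20s, 40s, 1m20s, 2m40s, 5m, 5m.
		backOff.Next(key, time.Now())
		fmt.Printf("failure %d -> next back-off %v\n", failure, backOff.Get(key))
	}

	// While this reports true, the pull is skipped and the pod worker logs
	// ImagePullBackOff: Back-off pulling image "...".
	fmt.Println("in back-off:", backOff.IsInBackOffSinceUpdate(key, time.Now()))
}
```

Once a pull finally succeeds, the log flips to ContainerStarted events for the same pods, as seen in the 13:20:52–13:20:59 entries that follow.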
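[Annotation] The pod_startup_latency_tracker entries surrounding this point report two figures per pod: podStartE2EDuration (watch-observed running time minus podCreationTimestamp) and podStartSLOduration, which additionally excludes the image-pull window (lastFinishedPulling − firstStartedPulling). The relationship holds for every tracker entry in this log, e.g. heat-operator: 33.950155125s − 28.173667676s = 5.776487449s, and the openstack-operator pod with zero-valued pull timestamps has identical SLO and E2E figures. A minimal sketch reproducing that arithmetic with the heat-operator timestamps; the parse layout is an assumption matching Go's default time.Time formatting used in the log.

```go
// Reproduces the startup-latency arithmetic from the tracker entries above,
// using the heat-operator-controller-manager timestamps as logged. Assumption:
// SLO duration = E2E duration minus the image-pull window, which every
// pod_startup_latency_tracker line in this log bears out.
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST" // Go's default time format, as in the log

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2026-02-02 13:20:14 +0000 UTC")                // podCreationTimestamp
	firstStartedPulling := mustParse("2026-02-02 13:20:17.946291525 +0000 UTC")
	lastFinishedPulling := mustParse("2026-02-02 13:20:46.119959201 +0000 UTC")
	watchObservedRunning := mustParse("2026-02-02 13:20:47.950155125 +0000 UTC")

	e2e := watchObservedRunning.Sub(created)                  // podStartE2EDuration = 33.950155125s
	pull := lastFinishedPulling.Sub(firstStartedPulling)      // image-pull window   = 28.173667676s
	slo := e2e - pull                                         // podStartSLOduration = 5.776487449s

	fmt.Println(e2e, pull, slo)
}
```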
Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.115863 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" event={"ID":"9d11c3e4-10b4-4ff4-aaa2-04e342d984b4","Type":"ContainerStarted","Data":"952925c167a862b67f4f41cbb9552a39c4b27ef647d400a67ea1ed96f7d94a9f"} Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.116016 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.128380 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-2828d" podStartSLOduration=5.125579219 podStartE2EDuration="40.128359164s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:17.877979761 +0000 UTC m=+1158.180494150" lastFinishedPulling="2026-02-02 13:20:52.880759706 +0000 UTC m=+1193.183274095" observedRunningTime="2026-02-02 13:20:54.126560316 +0000 UTC m=+1194.429074705" watchObservedRunningTime="2026-02-02 13:20:54.128359164 +0000 UTC m=+1194.430873563" Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.152752 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" podStartSLOduration=35.220100887 podStartE2EDuration="40.152738322s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:47.947095183 +0000 UTC m=+1188.249609572" lastFinishedPulling="2026-02-02 13:20:52.879732618 +0000 UTC m=+1193.182247007" observedRunningTime="2026-02-02 13:20:54.152129656 +0000 UTC m=+1194.454644045" watchObservedRunningTime="2026-02-02 13:20:54.152738322 +0000 UTC m=+1194.455252711" Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.198298 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" podStartSLOduration=35.382840762 podStartE2EDuration="40.198273742s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:48.064128253 +0000 UTC m=+1188.366642642" lastFinishedPulling="2026-02-02 13:20:52.879561233 +0000 UTC m=+1193.182075622" observedRunningTime="2026-02-02 13:20:54.180641345 +0000 UTC m=+1194.483155724" watchObservedRunningTime="2026-02-02 13:20:54.198273742 +0000 UTC m=+1194.500788131" Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.388546 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-729mv" Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.421009 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-q5lbf" Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.640753 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-qqpfm" Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.679513 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-8zlv5" Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.681672 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-8zlv5" Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.791713 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-s75st" Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.823378 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-sq5w5" Feb 02 13:20:54 crc kubenswrapper[4721]: I0202 13:20:54.880498 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-x6p4t" Feb 02 13:20:55 crc kubenswrapper[4721]: I0202 13:20:55.004556 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-sjvjw" Feb 02 13:20:55 crc kubenswrapper[4721]: I0202 13:20:55.042751 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-42qq8" Feb 02 13:20:55 crc kubenswrapper[4721]: I0202 13:20:55.119629 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-rzjts" Feb 02 13:20:55 crc kubenswrapper[4721]: I0202 13:20:55.134606 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm" Feb 02 13:20:55 crc kubenswrapper[4721]: I0202 13:20:55.137712 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-kqdjm" Feb 02 13:20:55 crc kubenswrapper[4721]: I0202 13:20:55.358151 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5b9ffd7d65-rgkhb" Feb 02 13:20:55 crc kubenswrapper[4721]: I0202 13:20:55.469283 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-564965969-4pk6v" Feb 02 13:20:55 crc kubenswrapper[4721]: I0202 13:20:55.638461 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-79zrv" Feb 02 13:20:57 crc kubenswrapper[4721]: I0202 13:20:57.565955 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5d6c59fb84-5s25p" Feb 02 13:20:59 crc kubenswrapper[4721]: I0202 13:20:59.162893 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-6z258" event={"ID":"1f3087b4-acf0-4a27-9696-bdfb4728e96c","Type":"ContainerStarted","Data":"5f2666561bbb7e0fa40c86683a4609b67a11327cccd9e6a4404430aeda5d605a"} Feb 02 13:20:59 crc kubenswrapper[4721]: I0202 13:20:59.163417 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-6z258" Feb 02 13:20:59 crc kubenswrapper[4721]: I0202 13:20:59.190094 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-6z258" podStartSLOduration=4.273265062 podStartE2EDuration="45.190050492s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" 
firstStartedPulling="2026-02-02 13:20:17.920491169 +0000 UTC m=+1158.223005558" lastFinishedPulling="2026-02-02 13:20:58.837276599 +0000 UTC m=+1199.139790988" observedRunningTime="2026-02-02 13:20:59.180008881 +0000 UTC m=+1199.482523290" watchObservedRunningTime="2026-02-02 13:20:59.190050492 +0000 UTC m=+1199.492564891" Feb 02 13:21:01 crc kubenswrapper[4721]: I0202 13:21:01.182612 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-5vbh8" event={"ID":"b9eeabcd-14ef-4800-9f0c-1a3cd515d2aa","Type":"ContainerStarted","Data":"2f9ba09e62dbc068ec3f14143c3c6fb998bfa7833571f655916f1e818ff9ebc6"} Feb 02 13:21:01 crc kubenswrapper[4721]: I0202 13:21:01.183414 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-5vbh8" Feb 02 13:21:01 crc kubenswrapper[4721]: I0202 13:21:01.199646 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-5vbh8" podStartSLOduration=4.048929307 podStartE2EDuration="47.199611979s" podCreationTimestamp="2026-02-02 13:20:14 +0000 UTC" firstStartedPulling="2026-02-02 13:20:17.805553877 +0000 UTC m=+1158.108068266" lastFinishedPulling="2026-02-02 13:21:00.956236549 +0000 UTC m=+1201.258750938" observedRunningTime="2026-02-02 13:21:01.196864325 +0000 UTC m=+1201.499378724" watchObservedRunningTime="2026-02-02 13:21:01.199611979 +0000 UTC m=+1201.502126368" Feb 02 13:21:04 crc kubenswrapper[4721]: I0202 13:21:04.930278 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-5x28t" Feb 02 13:21:04 crc kubenswrapper[4721]: I0202 13:21:04.987538 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ct6hc" Feb 02 13:21:05 crc kubenswrapper[4721]: I0202 13:21:05.060890 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-6z258" Feb 02 13:21:05 crc kubenswrapper[4721]: I0202 13:21:05.696434 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-2828d" Feb 02 13:21:06 crc kubenswrapper[4721]: I0202 13:21:06.714565 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-hktcl" Feb 02 13:21:06 crc kubenswrapper[4721]: I0202 13:21:06.881196 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw" Feb 02 13:21:14 crc kubenswrapper[4721]: I0202 13:21:14.764714 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:21:14 crc kubenswrapper[4721]: I0202 13:21:14.765593 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 02 13:21:14 crc kubenswrapper[4721]: I0202 13:21:14.969458 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-5vbh8" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.392397 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dx98r"] Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.406831 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dx98r" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.412447 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-9bfjx" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.412900 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.412927 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.413196 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.429183 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pk9z\" (UniqueName: \"kubernetes.io/projected/b256785d-0ae0-454d-8927-a28668507e06-kube-api-access-6pk9z\") pod \"dnsmasq-dns-675f4bcbfc-dx98r\" (UID: \"b256785d-0ae0-454d-8927-a28668507e06\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dx98r" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.429420 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b256785d-0ae0-454d-8927-a28668507e06-config\") pod \"dnsmasq-dns-675f4bcbfc-dx98r\" (UID: \"b256785d-0ae0-454d-8927-a28668507e06\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dx98r" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.447943 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dx98r"] Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.480885 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ppnbl"] Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.482449 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.484442 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.504897 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ppnbl"] Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.530770 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b256785d-0ae0-454d-8927-a28668507e06-config\") pod \"dnsmasq-dns-675f4bcbfc-dx98r\" (UID: \"b256785d-0ae0-454d-8927-a28668507e06\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dx98r" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.531144 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzdpg\" (UniqueName: \"kubernetes.io/projected/ded37b19-4830-4437-9a53-778f826f3582-kube-api-access-hzdpg\") pod \"dnsmasq-dns-78dd6ddcc-ppnbl\" (UID: \"ded37b19-4830-4437-9a53-778f826f3582\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.531191 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ded37b19-4830-4437-9a53-778f826f3582-config\") pod \"dnsmasq-dns-78dd6ddcc-ppnbl\" (UID: \"ded37b19-4830-4437-9a53-778f826f3582\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.531225 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ded37b19-4830-4437-9a53-778f826f3582-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ppnbl\" (UID: \"ded37b19-4830-4437-9a53-778f826f3582\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.531248 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pk9z\" (UniqueName: \"kubernetes.io/projected/b256785d-0ae0-454d-8927-a28668507e06-kube-api-access-6pk9z\") pod \"dnsmasq-dns-675f4bcbfc-dx98r\" (UID: \"b256785d-0ae0-454d-8927-a28668507e06\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dx98r" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.533205 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b256785d-0ae0-454d-8927-a28668507e06-config\") pod \"dnsmasq-dns-675f4bcbfc-dx98r\" (UID: \"b256785d-0ae0-454d-8927-a28668507e06\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dx98r" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.556147 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pk9z\" (UniqueName: \"kubernetes.io/projected/b256785d-0ae0-454d-8927-a28668507e06-kube-api-access-6pk9z\") pod \"dnsmasq-dns-675f4bcbfc-dx98r\" (UID: \"b256785d-0ae0-454d-8927-a28668507e06\") " pod="openstack/dnsmasq-dns-675f4bcbfc-dx98r" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.632357 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ded37b19-4830-4437-9a53-778f826f3582-config\") pod \"dnsmasq-dns-78dd6ddcc-ppnbl\" (UID: \"ded37b19-4830-4437-9a53-778f826f3582\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 
13:21:31.632411 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ded37b19-4830-4437-9a53-778f826f3582-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ppnbl\" (UID: \"ded37b19-4830-4437-9a53-778f826f3582\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.632516 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzdpg\" (UniqueName: \"kubernetes.io/projected/ded37b19-4830-4437-9a53-778f826f3582-kube-api-access-hzdpg\") pod \"dnsmasq-dns-78dd6ddcc-ppnbl\" (UID: \"ded37b19-4830-4437-9a53-778f826f3582\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.666384 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzdpg\" (UniqueName: \"kubernetes.io/projected/ded37b19-4830-4437-9a53-778f826f3582-kube-api-access-hzdpg\") pod \"dnsmasq-dns-78dd6ddcc-ppnbl\" (UID: \"ded37b19-4830-4437-9a53-778f826f3582\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.677686 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ded37b19-4830-4437-9a53-778f826f3582-config\") pod \"dnsmasq-dns-78dd6ddcc-ppnbl\" (UID: \"ded37b19-4830-4437-9a53-778f826f3582\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.677713 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ded37b19-4830-4437-9a53-778f826f3582-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ppnbl\" (UID: \"ded37b19-4830-4437-9a53-778f826f3582\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.737754 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dx98r" Feb 02 13:21:31 crc kubenswrapper[4721]: I0202 13:21:31.801085 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" Feb 02 13:21:32 crc kubenswrapper[4721]: W0202 13:21:32.228807 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb256785d_0ae0_454d_8927_a28668507e06.slice/crio-ffe8ae8e550e5fe561d651b70087eeacc3e6b311c6a4ad871733c9d878db06b3 WatchSource:0}: Error finding container ffe8ae8e550e5fe561d651b70087eeacc3e6b311c6a4ad871733c9d878db06b3: Status 404 returned error can't find the container with id ffe8ae8e550e5fe561d651b70087eeacc3e6b311c6a4ad871733c9d878db06b3 Feb 02 13:21:32 crc kubenswrapper[4721]: I0202 13:21:32.234472 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dx98r"] Feb 02 13:21:32 crc kubenswrapper[4721]: I0202 13:21:32.330733 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ppnbl"] Feb 02 13:21:32 crc kubenswrapper[4721]: W0202 13:21:32.335221 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded37b19_4830_4437_9a53_778f826f3582.slice/crio-e4ecfe7e04cb7aa8d37b9511552603f7febbe5a304c908615695a3cf3f4a961f WatchSource:0}: Error finding container e4ecfe7e04cb7aa8d37b9511552603f7febbe5a304c908615695a3cf3f4a961f: Status 404 returned error can't find the container with id e4ecfe7e04cb7aa8d37b9511552603f7febbe5a304c908615695a3cf3f4a961f Feb 02 13:21:32 crc kubenswrapper[4721]: I0202 13:21:32.439330 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" event={"ID":"ded37b19-4830-4437-9a53-778f826f3582","Type":"ContainerStarted","Data":"e4ecfe7e04cb7aa8d37b9511552603f7febbe5a304c908615695a3cf3f4a961f"} Feb 02 13:21:32 crc kubenswrapper[4721]: I0202 13:21:32.442030 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dx98r" event={"ID":"b256785d-0ae0-454d-8927-a28668507e06","Type":"ContainerStarted","Data":"ffe8ae8e550e5fe561d651b70087eeacc3e6b311c6a4ad871733c9d878db06b3"} Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.313605 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dx98r"] Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.338761 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-h9xj9"] Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.340697 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.352376 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-h9xj9"] Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.483920 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-config\") pod \"dnsmasq-dns-666b6646f7-h9xj9\" (UID: \"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff\") " pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.484288 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp8wm\" (UniqueName: \"kubernetes.io/projected/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-kube-api-access-gp8wm\") pod \"dnsmasq-dns-666b6646f7-h9xj9\" (UID: \"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff\") " pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.484330 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-dns-svc\") pod \"dnsmasq-dns-666b6646f7-h9xj9\" (UID: \"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff\") " pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.588227 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp8wm\" (UniqueName: \"kubernetes.io/projected/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-kube-api-access-gp8wm\") pod \"dnsmasq-dns-666b6646f7-h9xj9\" (UID: \"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff\") " pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.589776 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-dns-svc\") pod \"dnsmasq-dns-666b6646f7-h9xj9\" (UID: \"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff\") " pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.589858 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-dns-svc\") pod \"dnsmasq-dns-666b6646f7-h9xj9\" (UID: \"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff\") " pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.589986 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-config\") pod \"dnsmasq-dns-666b6646f7-h9xj9\" (UID: \"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff\") " pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.591926 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-config\") pod \"dnsmasq-dns-666b6646f7-h9xj9\" (UID: \"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff\") " pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.631090 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp8wm\" (UniqueName: 
\"kubernetes.io/projected/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-kube-api-access-gp8wm\") pod \"dnsmasq-dns-666b6646f7-h9xj9\" (UID: \"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff\") " pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.659549 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ppnbl"] Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.666293 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.687008 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ljjc5"] Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.689760 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.711382 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ljjc5"] Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.799262 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9598cea-e831-47fd-aa1a-08060e23bba2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ljjc5\" (UID: \"f9598cea-e831-47fd-aa1a-08060e23bba2\") " pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.799395 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpqwx\" (UniqueName: \"kubernetes.io/projected/f9598cea-e831-47fd-aa1a-08060e23bba2-kube-api-access-rpqwx\") pod \"dnsmasq-dns-57d769cc4f-ljjc5\" (UID: \"f9598cea-e831-47fd-aa1a-08060e23bba2\") " pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.799428 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9598cea-e831-47fd-aa1a-08060e23bba2-config\") pod \"dnsmasq-dns-57d769cc4f-ljjc5\" (UID: \"f9598cea-e831-47fd-aa1a-08060e23bba2\") " pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.903130 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpqwx\" (UniqueName: \"kubernetes.io/projected/f9598cea-e831-47fd-aa1a-08060e23bba2-kube-api-access-rpqwx\") pod \"dnsmasq-dns-57d769cc4f-ljjc5\" (UID: \"f9598cea-e831-47fd-aa1a-08060e23bba2\") " pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.903552 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9598cea-e831-47fd-aa1a-08060e23bba2-config\") pod \"dnsmasq-dns-57d769cc4f-ljjc5\" (UID: \"f9598cea-e831-47fd-aa1a-08060e23bba2\") " pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.903677 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9598cea-e831-47fd-aa1a-08060e23bba2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ljjc5\" (UID: \"f9598cea-e831-47fd-aa1a-08060e23bba2\") " pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.904707 4721 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9598cea-e831-47fd-aa1a-08060e23bba2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ljjc5\" (UID: \"f9598cea-e831-47fd-aa1a-08060e23bba2\") " pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.905083 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9598cea-e831-47fd-aa1a-08060e23bba2-config\") pod \"dnsmasq-dns-57d769cc4f-ljjc5\" (UID: \"f9598cea-e831-47fd-aa1a-08060e23bba2\") " pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" Feb 02 13:21:34 crc kubenswrapper[4721]: I0202 13:21:34.930896 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpqwx\" (UniqueName: \"kubernetes.io/projected/f9598cea-e831-47fd-aa1a-08060e23bba2-kube-api-access-rpqwx\") pod \"dnsmasq-dns-57d769cc4f-ljjc5\" (UID: \"f9598cea-e831-47fd-aa1a-08060e23bba2\") " pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.147397 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.320562 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-h9xj9"] Feb 02 13:21:35 crc kubenswrapper[4721]: W0202 13:21:35.321220 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod242f2c9f_2150_4d1a_8c40_9e34f1ffc5ff.slice/crio-f45d2f8df0c32334c973de797a13b02d6fc09764b2e977590919ddc80c08dc8f WatchSource:0}: Error finding container f45d2f8df0c32334c973de797a13b02d6fc09764b2e977590919ddc80c08dc8f: Status 404 returned error can't find the container with id f45d2f8df0c32334c973de797a13b02d6fc09764b2e977590919ddc80c08dc8f Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.471060 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.473179 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.475821 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.476569 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.476751 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.477015 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.477206 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.477208 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rc7jr" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.477232 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.506540 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.516120 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.517206 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a57cea33-806c-4028-b59f-9f5e65289eac-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.517315 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a57cea33-806c-4028-b59f-9f5e65289eac-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.517350 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a57cea33-806c-4028-b59f-9f5e65289eac-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.517376 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a57cea33-806c-4028-b59f-9f5e65289eac-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.517492 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkd9k\" (UniqueName: \"kubernetes.io/projected/a57cea33-806c-4028-b59f-9f5e65289eac-kube-api-access-zkd9k\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.517613 4721 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5ffe0936-af63-4f85-b1ec-8b9cf7b1c000\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ffe0936-af63-4f85-b1ec-8b9cf7b1c000\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.517643 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a57cea33-806c-4028-b59f-9f5e65289eac-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.517689 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a57cea33-806c-4028-b59f-9f5e65289eac-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.517719 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a57cea33-806c-4028-b59f-9f5e65289eac-config-data\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.517744 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a57cea33-806c-4028-b59f-9f5e65289eac-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.517768 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a57cea33-806c-4028-b59f-9f5e65289eac-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.519201 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" event={"ID":"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff","Type":"ContainerStarted","Data":"f45d2f8df0c32334c973de797a13b02d6fc09764b2e977590919ddc80c08dc8f"} Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.519310 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.524992 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.526946 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.534525 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.545821 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.620580 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5ffe0936-af63-4f85-b1ec-8b9cf7b1c000\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ffe0936-af63-4f85-b1ec-8b9cf7b1c000\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.620649 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a57cea33-806c-4028-b59f-9f5e65289eac-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.620737 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a57cea33-806c-4028-b59f-9f5e65289eac-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.620782 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a57cea33-806c-4028-b59f-9f5e65289eac-config-data\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.620803 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a57cea33-806c-4028-b59f-9f5e65289eac-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.620819 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a57cea33-806c-4028-b59f-9f5e65289eac-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.620937 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a57cea33-806c-4028-b59f-9f5e65289eac-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.621014 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a57cea33-806c-4028-b59f-9f5e65289eac-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.621045 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/a57cea33-806c-4028-b59f-9f5e65289eac-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.621091 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a57cea33-806c-4028-b59f-9f5e65289eac-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.621123 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkd9k\" (UniqueName: \"kubernetes.io/projected/a57cea33-806c-4028-b59f-9f5e65289eac-kube-api-access-zkd9k\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.622511 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a57cea33-806c-4028-b59f-9f5e65289eac-config-data\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.623703 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a57cea33-806c-4028-b59f-9f5e65289eac-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.624204 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a57cea33-806c-4028-b59f-9f5e65289eac-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.625112 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a57cea33-806c-4028-b59f-9f5e65289eac-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.630269 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a57cea33-806c-4028-b59f-9f5e65289eac-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.630671 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.630703 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5ffe0936-af63-4f85-b1ec-8b9cf7b1c000\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ffe0936-af63-4f85-b1ec-8b9cf7b1c000\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f768cdfd31e4466d54219143b93bec72e999993fecffa835aa705bdc902f25fb/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.631556 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a57cea33-806c-4028-b59f-9f5e65289eac-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.632428 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a57cea33-806c-4028-b59f-9f5e65289eac-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.633145 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a57cea33-806c-4028-b59f-9f5e65289eac-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.633178 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a57cea33-806c-4028-b59f-9f5e65289eac-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.638796 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ljjc5"] Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.645417 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkd9k\" (UniqueName: \"kubernetes.io/projected/a57cea33-806c-4028-b59f-9f5e65289eac-kube-api-access-zkd9k\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.663868 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5ffe0936-af63-4f85-b1ec-8b9cf7b1c000\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ffe0936-af63-4f85-b1ec-8b9cf7b1c000\") pod \"rabbitmq-server-0\" (UID: \"a57cea33-806c-4028-b59f-9f5e65289eac\") " pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.722967 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4d21d961-1540-4610-89c0-ee265f66d728-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.723339 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/4d21d961-1540-4610-89c0-ee265f66d728-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.723368 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c21ba8be-1b43-4a8e-b03c-2a4e8cecb516\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c21ba8be-1b43-4a8e-b03c-2a4e8cecb516\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.724162 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4d21d961-1540-4610-89c0-ee265f66d728-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.724257 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bdnv\" (UniqueName: \"kubernetes.io/projected/4d21d961-1540-4610-89c0-ee265f66d728-kube-api-access-7bdnv\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.724275 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b6dbd607-3fa8-48e0-b420-4e939a47c460-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.724335 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4d21d961-1540-4610-89c0-ee265f66d728-pod-info\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.724402 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b6dbd607-3fa8-48e0-b420-4e939a47c460-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.724428 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a64b0146-2b46-4525-a4ab-0ae755e09ffc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a64b0146-2b46-4525-a4ab-0ae755e09ffc\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.724460 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b6dbd607-3fa8-48e0-b420-4e939a47c460-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.724509 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/b6dbd607-3fa8-48e0-b420-4e939a47c460-server-conf\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.724532 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4d21d961-1540-4610-89c0-ee265f66d728-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.724625 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b6dbd607-3fa8-48e0-b420-4e939a47c460-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.726757 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b6dbd607-3fa8-48e0-b420-4e939a47c460-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.726828 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4d21d961-1540-4610-89c0-ee265f66d728-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.726857 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4d21d961-1540-4610-89c0-ee265f66d728-server-conf\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.726906 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6dbd607-3fa8-48e0-b420-4e939a47c460-config-data\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.726957 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d21d961-1540-4610-89c0-ee265f66d728-config-data\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.726980 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b6dbd607-3fa8-48e0-b420-4e939a47c460-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.727038 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/b6dbd607-3fa8-48e0-b420-4e939a47c460-pod-info\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.727097 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4d21d961-1540-4610-89c0-ee265f66d728-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.727128 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hj7s\" (UniqueName: \"kubernetes.io/projected/b6dbd607-3fa8-48e0-b420-4e939a47c460-kube-api-access-9hj7s\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.824794 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.828741 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4d21d961-1540-4610-89c0-ee265f66d728-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.829781 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4d21d961-1540-4610-89c0-ee265f66d728-server-conf\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.829740 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4d21d961-1540-4610-89c0-ee265f66d728-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.829857 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6dbd607-3fa8-48e0-b420-4e939a47c460-config-data\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.829887 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d21d961-1540-4610-89c0-ee265f66d728-config-data\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.831015 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4d21d961-1540-4610-89c0-ee265f66d728-server-conf\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.831095 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d21d961-1540-4610-89c0-ee265f66d728-config-data\") 
pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.831180 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b6dbd607-3fa8-48e0-b420-4e939a47c460-config-data\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.832747 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b6dbd607-3fa8-48e0-b420-4e939a47c460-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.833384 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b6dbd607-3fa8-48e0-b420-4e939a47c460-pod-info\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.833422 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4d21d961-1540-4610-89c0-ee265f66d728-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.833453 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hj7s\" (UniqueName: \"kubernetes.io/projected/b6dbd607-3fa8-48e0-b420-4e939a47c460-kube-api-access-9hj7s\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.833535 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4d21d961-1540-4610-89c0-ee265f66d728-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.833573 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4d21d961-1540-4610-89c0-ee265f66d728-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.833601 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c21ba8be-1b43-4a8e-b03c-2a4e8cecb516\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c21ba8be-1b43-4a8e-b03c-2a4e8cecb516\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.833623 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4d21d961-1540-4610-89c0-ee265f66d728-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.833701 4721 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bdnv\" (UniqueName: \"kubernetes.io/projected/4d21d961-1540-4610-89c0-ee265f66d728-kube-api-access-7bdnv\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.833726 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b6dbd607-3fa8-48e0-b420-4e939a47c460-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.833763 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4d21d961-1540-4610-89c0-ee265f66d728-pod-info\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.833803 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b6dbd607-3fa8-48e0-b420-4e939a47c460-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.833829 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a64b0146-2b46-4525-a4ab-0ae755e09ffc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a64b0146-2b46-4525-a4ab-0ae755e09ffc\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.833856 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b6dbd607-3fa8-48e0-b420-4e939a47c460-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.833921 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b6dbd607-3fa8-48e0-b420-4e939a47c460-server-conf\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.833950 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4d21d961-1540-4610-89c0-ee265f66d728-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.834017 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b6dbd607-3fa8-48e0-b420-4e939a47c460-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.834141 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/b6dbd607-3fa8-48e0-b420-4e939a47c460-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.834944 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b6dbd607-3fa8-48e0-b420-4e939a47c460-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.835235 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b6dbd607-3fa8-48e0-b420-4e939a47c460-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.835272 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b6dbd607-3fa8-48e0-b420-4e939a47c460-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.836338 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b6dbd607-3fa8-48e0-b420-4e939a47c460-server-conf\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.836680 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4d21d961-1540-4610-89c0-ee265f66d728-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.837160 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b6dbd607-3fa8-48e0-b420-4e939a47c460-pod-info\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.837476 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4d21d961-1540-4610-89c0-ee265f66d728-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.839845 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4d21d961-1540-4610-89c0-ee265f66d728-pod-info\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.845885 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b6dbd607-3fa8-48e0-b420-4e939a47c460-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.845895 4721 csi_attacher.go:380] kubernetes.io/csi: 
attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.845955 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a64b0146-2b46-4525-a4ab-0ae755e09ffc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a64b0146-2b46-4525-a4ab-0ae755e09ffc\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f75a550d0755a9bbbcdd150a61fd98bebb2f378fbc6fe94c4a9320c6ab4aa089/globalmount\"" pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.845964 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4d21d961-1540-4610-89c0-ee265f66d728-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.846054 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4d21d961-1540-4610-89c0-ee265f66d728-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.846084 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b6dbd607-3fa8-48e0-b420-4e939a47c460-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.846531 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4d21d961-1540-4610-89c0-ee265f66d728-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.846760 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
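
The csi_attacher.go:380 entries explain why no separate staging step appears for the RabbitMQ PVCs: the kubevirt.io.hostpath-provisioner driver does not advertise the CSI STAGE_UNSTAGE_VOLUME node capability, so the kubelet skips the NodeStageVolume call and immediately reports "MountVolume.MountDevice succeeded" together with the per-volume globalmount directory it will bind-mount from. Those MountDevice lines are also the quickest way to map each PVC to its pod and on-disk path. A minimal sketch under the same assumptions as above (plain-text kubelet.log, pattern derived from the operation_generator.go:580 entries shown here):

    import re

    # MountDevice lines carry the PVC name, the globalmount directory under
    # /var/lib/kubelet/plugins/kubernetes.io/csi/..., and the triggering pod.
    DEVICE = re.compile(
        r'MountVolume\.MountDevice succeeded for volume \\"(?P<pvc>pvc-[0-9a-f-]+)\\"'
        r'.*device mount path \\"(?P<path>[^\\"]+)\\"'
        r'.*pod="(?P<pod>[^"]+)"')

    with open("kubelet.log", encoding="utf-8", errors="replace") as fh:
        for line in fh:
            if (m := DEVICE.search(line)):
                print(f'{m["pod"]}  {m["pvc"]}  ->  {m["path"]}')

Run against this section, it would report pvc-5ffe0936-... for rabbitmq-server-0, pvc-a64b0146-... for rabbitmq-server-1 and pvc-c21ba8be-... for rabbitmq-server-2, each staged under its own .../globalmount directory.
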
Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.846793 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c21ba8be-1b43-4a8e-b03c-2a4e8cecb516\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c21ba8be-1b43-4a8e-b03c-2a4e8cecb516\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/82d0d0f3d0e2d4632b5139e9a2e3120dee94db4e9ede7c4dd6d6473a90916d83/globalmount\"" pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.846967 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b6dbd607-3fa8-48e0-b420-4e939a47c460-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.855266 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bdnv\" (UniqueName: \"kubernetes.io/projected/4d21d961-1540-4610-89c0-ee265f66d728-kube-api-access-7bdnv\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.856701 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hj7s\" (UniqueName: \"kubernetes.io/projected/b6dbd607-3fa8-48e0-b420-4e939a47c460-kube-api-access-9hj7s\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.863225 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.864890 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.869769 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.869884 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.870168 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.870479 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-r4bj5" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.870613 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.870660 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.870731 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.876765 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.896119 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c21ba8be-1b43-4a8e-b03c-2a4e8cecb516\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c21ba8be-1b43-4a8e-b03c-2a4e8cecb516\") pod \"rabbitmq-server-2\" (UID: \"b6dbd607-3fa8-48e0-b420-4e939a47c460\") " pod="openstack/rabbitmq-server-2" Feb 02 13:21:35 crc kubenswrapper[4721]: I0202 13:21:35.969711 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a64b0146-2b46-4525-a4ab-0ae755e09ffc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a64b0146-2b46-4525-a4ab-0ae755e09ffc\") pod \"rabbitmq-server-1\" (UID: \"4d21d961-1540-4610-89c0-ee265f66d728\") " pod="openstack/rabbitmq-server-1" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.045080 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/496bb19e-217b-4896-9bee-8082ac5da28b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.045149 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/496bb19e-217b-4896-9bee-8082ac5da28b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.045180 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/496bb19e-217b-4896-9bee-8082ac5da28b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.045204 4721 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/496bb19e-217b-4896-9bee-8082ac5da28b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.045227 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npm8v\" (UniqueName: \"kubernetes.io/projected/496bb19e-217b-4896-9bee-8082ac5da28b-kube-api-access-npm8v\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.045244 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/496bb19e-217b-4896-9bee-8082ac5da28b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.045277 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/496bb19e-217b-4896-9bee-8082ac5da28b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.045297 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/496bb19e-217b-4896-9bee-8082ac5da28b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.045320 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/496bb19e-217b-4896-9bee-8082ac5da28b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.045349 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/496bb19e-217b-4896-9bee-8082ac5da28b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.045372 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e97c06f6-7cba-4e06-8735-05d7d5af9e90\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e97c06f6-7cba-4e06-8735-05d7d5af9e90\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.151173 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/496bb19e-217b-4896-9bee-8082ac5da28b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc 
kubenswrapper[4721]: I0202 13:21:36.151260 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/496bb19e-217b-4896-9bee-8082ac5da28b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.151284 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/496bb19e-217b-4896-9bee-8082ac5da28b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.151321 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/496bb19e-217b-4896-9bee-8082ac5da28b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.151346 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npm8v\" (UniqueName: \"kubernetes.io/projected/496bb19e-217b-4896-9bee-8082ac5da28b-kube-api-access-npm8v\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.151397 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/496bb19e-217b-4896-9bee-8082ac5da28b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.151426 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/496bb19e-217b-4896-9bee-8082ac5da28b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.151455 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/496bb19e-217b-4896-9bee-8082ac5da28b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.151480 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/496bb19e-217b-4896-9bee-8082ac5da28b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.151516 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e97c06f6-7cba-4e06-8735-05d7d5af9e90\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e97c06f6-7cba-4e06-8735-05d7d5af9e90\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.151617 4721 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/496bb19e-217b-4896-9bee-8082ac5da28b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.152173 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/496bb19e-217b-4896-9bee-8082ac5da28b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.153785 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/496bb19e-217b-4896-9bee-8082ac5da28b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.154568 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/496bb19e-217b-4896-9bee-8082ac5da28b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.155774 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/496bb19e-217b-4896-9bee-8082ac5da28b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.170963 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/496bb19e-217b-4896-9bee-8082ac5da28b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.171880 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/496bb19e-217b-4896-9bee-8082ac5da28b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.171924 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.173918 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/496bb19e-217b-4896-9bee-8082ac5da28b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.175871 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/496bb19e-217b-4896-9bee-8082ac5da28b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.176328 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.197657 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/496bb19e-217b-4896-9bee-8082ac5da28b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.232722 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.232961 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e97c06f6-7cba-4e06-8735-05d7d5af9e90\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e97c06f6-7cba-4e06-8735-05d7d5af9e90\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/73a1434e638350512a0f0c04a2f3b8af25c25424644f85a28328e741ff86171d/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.250138 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npm8v\" (UniqueName: \"kubernetes.io/projected/496bb19e-217b-4896-9bee-8082ac5da28b-kube-api-access-npm8v\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.293346 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e97c06f6-7cba-4e06-8735-05d7d5af9e90\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e97c06f6-7cba-4e06-8735-05d7d5af9e90\") pod \"rabbitmq-cell1-server-0\" (UID: \"496bb19e-217b-4896-9bee-8082ac5da28b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:36 crc kubenswrapper[4721]: I0202 13:21:36.562675 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.271485 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.273103 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.278507 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-hs7j6" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.278908 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.279017 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.279148 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.287343 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.290477 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.377479 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-config-data-default\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.378033 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.378263 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.378296 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-kolla-config\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.378566 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.378898 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbksr\" (UniqueName: \"kubernetes.io/projected/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-kube-api-access-kbksr\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.379155 4721 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-67f27995-a569-40a0-9fab-b26d2604807c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f27995-a569-40a0-9fab-b26d2604807c\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.379282 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.480960 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.481049 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbksr\" (UniqueName: \"kubernetes.io/projected/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-kube-api-access-kbksr\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.481136 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-67f27995-a569-40a0-9fab-b26d2604807c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f27995-a569-40a0-9fab-b26d2604807c\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.481177 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.481208 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-config-data-default\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.481233 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.481266 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.481296 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-kolla-config\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.482018 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.482253 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-kolla-config\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.482537 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-config-data-default\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.483180 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.485301 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.485836 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.506749 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbksr\" (UniqueName: \"kubernetes.io/projected/f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4-kube-api-access-kbksr\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.509234 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.509287 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-67f27995-a569-40a0-9fab-b26d2604807c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f27995-a569-40a0-9fab-b26d2604807c\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/02cfc071c80aa38381e977cb3a0a7f7727c6750de0c68f48e91714708c5e03a5/globalmount\"" pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.539651 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-67f27995-a569-40a0-9fab-b26d2604807c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67f27995-a569-40a0-9fab-b26d2604807c\") pod \"openstack-galera-0\" (UID: \"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4\") " pod="openstack/openstack-galera-0" Feb 02 13:21:37 crc kubenswrapper[4721]: I0202 13:21:37.603552 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.731806 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.733761 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.735968 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.736420 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-z2vw8" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.736604 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.741909 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.751544 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.908429 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2505bd6b-64d4-4d17-9c1a-0e89562612be-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.908478 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8564e4e5-bb86-4518-8e65-498ae90b7207\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8564e4e5-bb86-4518-8e65-498ae90b7207\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.908529 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2505bd6b-64d4-4d17-9c1a-0e89562612be-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: 
\"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.908565 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgtvc\" (UniqueName: \"kubernetes.io/projected/2505bd6b-64d4-4d17-9c1a-0e89562612be-kube-api-access-pgtvc\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.908590 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2505bd6b-64d4-4d17-9c1a-0e89562612be-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.908633 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2505bd6b-64d4-4d17-9c1a-0e89562612be-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.908657 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2505bd6b-64d4-4d17-9c1a-0e89562612be-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.908680 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2505bd6b-64d4-4d17-9c1a-0e89562612be-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.937352 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.939225 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.943666 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-q9dzm" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.943688 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.943799 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 02 13:21:38 crc kubenswrapper[4721]: I0202 13:21:38.958184 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.011338 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2505bd6b-64d4-4d17-9c1a-0e89562612be-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.011398 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2505bd6b-64d4-4d17-9c1a-0e89562612be-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.011437 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2505bd6b-64d4-4d17-9c1a-0e89562612be-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.011570 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2505bd6b-64d4-4d17-9c1a-0e89562612be-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.011602 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8564e4e5-bb86-4518-8e65-498ae90b7207\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8564e4e5-bb86-4518-8e65-498ae90b7207\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.011639 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2505bd6b-64d4-4d17-9c1a-0e89562612be-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.011678 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgtvc\" (UniqueName: \"kubernetes.io/projected/2505bd6b-64d4-4d17-9c1a-0e89562612be-kube-api-access-pgtvc\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.011710 4721 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2505bd6b-64d4-4d17-9c1a-0e89562612be-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.012051 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2505bd6b-64d4-4d17-9c1a-0e89562612be-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.012582 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2505bd6b-64d4-4d17-9c1a-0e89562612be-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.012637 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2505bd6b-64d4-4d17-9c1a-0e89562612be-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.014011 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2505bd6b-64d4-4d17-9c1a-0e89562612be-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.015922 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.015954 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8564e4e5-bb86-4518-8e65-498ae90b7207\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8564e4e5-bb86-4518-8e65-498ae90b7207\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/95677eff6a9bdedc5b505c2f3bdc046d1b5b0ef5e3e705c1b7f37b1888d23524/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.018305 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2505bd6b-64d4-4d17-9c1a-0e89562612be-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.032761 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2505bd6b-64d4-4d17-9c1a-0e89562612be-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.032934 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgtvc\" (UniqueName: \"kubernetes.io/projected/2505bd6b-64d4-4d17-9c1a-0e89562612be-kube-api-access-pgtvc\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.073371 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8564e4e5-bb86-4518-8e65-498ae90b7207\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8564e4e5-bb86-4518-8e65-498ae90b7207\") pod \"openstack-cell1-galera-0\" (UID: \"2505bd6b-64d4-4d17-9c1a-0e89562612be\") " pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.113404 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a686ac60-f231-4070-98c7-7acbc66c29d5-kolla-config\") pod \"memcached-0\" (UID: \"a686ac60-f231-4070-98c7-7acbc66c29d5\") " pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.113503 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a686ac60-f231-4070-98c7-7acbc66c29d5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a686ac60-f231-4070-98c7-7acbc66c29d5\") " pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.113556 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a686ac60-f231-4070-98c7-7acbc66c29d5-config-data\") pod \"memcached-0\" (UID: \"a686ac60-f231-4070-98c7-7acbc66c29d5\") " pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.113596 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shkz9\" (UniqueName: 
\"kubernetes.io/projected/a686ac60-f231-4070-98c7-7acbc66c29d5-kube-api-access-shkz9\") pod \"memcached-0\" (UID: \"a686ac60-f231-4070-98c7-7acbc66c29d5\") " pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.113634 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a686ac60-f231-4070-98c7-7acbc66c29d5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a686ac60-f231-4070-98c7-7acbc66c29d5\") " pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.214797 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a686ac60-f231-4070-98c7-7acbc66c29d5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a686ac60-f231-4070-98c7-7acbc66c29d5\") " pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.214881 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a686ac60-f231-4070-98c7-7acbc66c29d5-config-data\") pod \"memcached-0\" (UID: \"a686ac60-f231-4070-98c7-7acbc66c29d5\") " pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.214927 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shkz9\" (UniqueName: \"kubernetes.io/projected/a686ac60-f231-4070-98c7-7acbc66c29d5-kube-api-access-shkz9\") pod \"memcached-0\" (UID: \"a686ac60-f231-4070-98c7-7acbc66c29d5\") " pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.214976 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a686ac60-f231-4070-98c7-7acbc66c29d5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a686ac60-f231-4070-98c7-7acbc66c29d5\") " pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.215040 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a686ac60-f231-4070-98c7-7acbc66c29d5-kolla-config\") pod \"memcached-0\" (UID: \"a686ac60-f231-4070-98c7-7acbc66c29d5\") " pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.215958 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a686ac60-f231-4070-98c7-7acbc66c29d5-kolla-config\") pod \"memcached-0\" (UID: \"a686ac60-f231-4070-98c7-7acbc66c29d5\") " pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.216515 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a686ac60-f231-4070-98c7-7acbc66c29d5-config-data\") pod \"memcached-0\" (UID: \"a686ac60-f231-4070-98c7-7acbc66c29d5\") " pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.221126 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a686ac60-f231-4070-98c7-7acbc66c29d5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a686ac60-f231-4070-98c7-7acbc66c29d5\") " pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.221496 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/a686ac60-f231-4070-98c7-7acbc66c29d5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a686ac60-f231-4070-98c7-7acbc66c29d5\") " pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.235877 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shkz9\" (UniqueName: \"kubernetes.io/projected/a686ac60-f231-4070-98c7-7acbc66c29d5-kube-api-access-shkz9\") pod \"memcached-0\" (UID: \"a686ac60-f231-4070-98c7-7acbc66c29d5\") " pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.256224 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 02 13:21:39 crc kubenswrapper[4721]: I0202 13:21:39.362474 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 02 13:21:40 crc kubenswrapper[4721]: W0202 13:21:40.087712 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9598cea_e831_47fd_aa1a_08060e23bba2.slice/crio-dcd5d79dba084b60aa8e85cba431f52f20415983b8d07817faa29db86eef35ea WatchSource:0}: Error finding container dcd5d79dba084b60aa8e85cba431f52f20415983b8d07817faa29db86eef35ea: Status 404 returned error can't find the container with id dcd5d79dba084b60aa8e85cba431f52f20415983b8d07817faa29db86eef35ea Feb 02 13:21:40 crc kubenswrapper[4721]: I0202 13:21:40.572021 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" event={"ID":"f9598cea-e831-47fd-aa1a-08060e23bba2","Type":"ContainerStarted","Data":"dcd5d79dba084b60aa8e85cba431f52f20415983b8d07817faa29db86eef35ea"} Feb 02 13:21:41 crc kubenswrapper[4721]: I0202 13:21:41.076544 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 13:21:41 crc kubenswrapper[4721]: I0202 13:21:41.079231 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 13:21:41 crc kubenswrapper[4721]: I0202 13:21:41.089097 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-78vw2" Feb 02 13:21:41 crc kubenswrapper[4721]: I0202 13:21:41.122114 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 13:21:41 crc kubenswrapper[4721]: I0202 13:21:41.154253 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlrn8\" (UniqueName: \"kubernetes.io/projected/cc071000-a602-4de6-a9bc-1c93b6d58c25-kube-api-access-wlrn8\") pod \"kube-state-metrics-0\" (UID: \"cc071000-a602-4de6-a9bc-1c93b6d58c25\") " pod="openstack/kube-state-metrics-0" Feb 02 13:21:41 crc kubenswrapper[4721]: I0202 13:21:41.275321 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlrn8\" (UniqueName: \"kubernetes.io/projected/cc071000-a602-4de6-a9bc-1c93b6d58c25-kube-api-access-wlrn8\") pod \"kube-state-metrics-0\" (UID: \"cc071000-a602-4de6-a9bc-1c93b6d58c25\") " pod="openstack/kube-state-metrics-0" Feb 02 13:21:41 crc kubenswrapper[4721]: I0202 13:21:41.338350 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlrn8\" (UniqueName: \"kubernetes.io/projected/cc071000-a602-4de6-a9bc-1c93b6d58c25-kube-api-access-wlrn8\") pod \"kube-state-metrics-0\" (UID: \"cc071000-a602-4de6-a9bc-1c93b6d58c25\") " pod="openstack/kube-state-metrics-0" Feb 02 13:21:41 crc kubenswrapper[4721]: I0202 13:21:41.454474 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.036931 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-6lvhx"] Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.038152 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-6lvhx" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.042173 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.042354 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-vlt46" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.050621 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-6lvhx"] Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.193438 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lp8f\" (UniqueName: \"kubernetes.io/projected/6064a9a4-2316-4bdd-abf1-934e9167528a-kube-api-access-7lp8f\") pod \"observability-ui-dashboards-66cbf594b5-6lvhx\" (UID: \"6064a9a4-2316-4bdd-abf1-934e9167528a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-6lvhx" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.193765 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6064a9a4-2316-4bdd-abf1-934e9167528a-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-6lvhx\" (UID: \"6064a9a4-2316-4bdd-abf1-934e9167528a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-6lvhx" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.296007 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lp8f\" (UniqueName: \"kubernetes.io/projected/6064a9a4-2316-4bdd-abf1-934e9167528a-kube-api-access-7lp8f\") pod \"observability-ui-dashboards-66cbf594b5-6lvhx\" (UID: \"6064a9a4-2316-4bdd-abf1-934e9167528a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-6lvhx" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.296081 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6064a9a4-2316-4bdd-abf1-934e9167528a-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-6lvhx\" (UID: \"6064a9a4-2316-4bdd-abf1-934e9167528a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-6lvhx" Feb 02 13:21:42 crc kubenswrapper[4721]: E0202 13:21:42.296231 4721 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Feb 02 13:21:42 crc kubenswrapper[4721]: E0202 13:21:42.296306 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6064a9a4-2316-4bdd-abf1-934e9167528a-serving-cert podName:6064a9a4-2316-4bdd-abf1-934e9167528a nodeName:}" failed. No retries permitted until 2026-02-02 13:21:42.796278648 +0000 UTC m=+1243.098793037 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6064a9a4-2316-4bdd-abf1-934e9167528a-serving-cert") pod "observability-ui-dashboards-66cbf594b5-6lvhx" (UID: "6064a9a4-2316-4bdd-abf1-934e9167528a") : secret "observability-ui-dashboards" not found Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.323822 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lp8f\" (UniqueName: \"kubernetes.io/projected/6064a9a4-2316-4bdd-abf1-934e9167528a-kube-api-access-7lp8f\") pod \"observability-ui-dashboards-66cbf594b5-6lvhx\" (UID: \"6064a9a4-2316-4bdd-abf1-934e9167528a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-6lvhx" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.431821 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cc4f8f495-dxd8x"] Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.433687 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.459305 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.462889 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.468742 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.469366 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.469647 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-xt8ls" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.470251 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.471532 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.472366 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.481505 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.481690 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.532189 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cc4f8f495-dxd8x"] Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.566392 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601141 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbz2v\" (UniqueName: \"kubernetes.io/projected/20201931-5a9c-4f86-ad4d-1df122372f8a-kube-api-access-mbz2v\") pod 
\"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601222 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/754114a2-a012-43fe-923b-a8cc3df91aa0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601249 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601267 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20201931-5a9c-4f86-ad4d-1df122372f8a-trusted-ca-bundle\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601285 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/20201931-5a9c-4f86-ad4d-1df122372f8a-console-serving-cert\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601306 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq5th\" (UniqueName: \"kubernetes.io/projected/754114a2-a012-43fe-923b-a8cc3df91aa0-kube-api-access-nq5th\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601321 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/20201931-5a9c-4f86-ad4d-1df122372f8a-console-config\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601349 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601393 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/754114a2-a012-43fe-923b-a8cc3df91aa0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601416 4721 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601469 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-config\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601489 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/20201931-5a9c-4f86-ad4d-1df122372f8a-console-oauth-config\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601527 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601560 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601576 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/20201931-5a9c-4f86-ad4d-1df122372f8a-service-ca\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601592 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/20201931-5a9c-4f86-ad4d-1df122372f8a-oauth-serving-cert\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.601611 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.703433 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.703506 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.703528 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/20201931-5a9c-4f86-ad4d-1df122372f8a-service-ca\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.703549 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/20201931-5a9c-4f86-ad4d-1df122372f8a-oauth-serving-cert\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.703591 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.703614 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbz2v\" (UniqueName: \"kubernetes.io/projected/20201931-5a9c-4f86-ad4d-1df122372f8a-kube-api-access-mbz2v\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.704547 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/20201931-5a9c-4f86-ad4d-1df122372f8a-oauth-serving-cert\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.704665 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/20201931-5a9c-4f86-ad4d-1df122372f8a-service-ca\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.704701 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/754114a2-a012-43fe-923b-a8cc3df91aa0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.704764 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/20201931-5a9c-4f86-ad4d-1df122372f8a-trusted-ca-bundle\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.704797 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.704830 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/20201931-5a9c-4f86-ad4d-1df122372f8a-console-serving-cert\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.704863 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq5th\" (UniqueName: \"kubernetes.io/projected/754114a2-a012-43fe-923b-a8cc3df91aa0-kube-api-access-nq5th\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.704883 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/20201931-5a9c-4f86-ad4d-1df122372f8a-console-config\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.704927 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.705043 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/754114a2-a012-43fe-923b-a8cc3df91aa0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.705407 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.705456 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-config\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.705503 4721 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/20201931-5a9c-4f86-ad4d-1df122372f8a-console-oauth-config\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.705733 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/20201931-5a9c-4f86-ad4d-1df122372f8a-console-config\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.705862 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.706128 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20201931-5a9c-4f86-ad4d-1df122372f8a-trusted-ca-bundle\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.706632 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.706728 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.716815 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/754114a2-a012-43fe-923b-a8cc3df91aa0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.717736 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/754114a2-a012-43fe-923b-a8cc3df91aa0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.720653 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 
13:21:42.720863 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/20201931-5a9c-4f86-ad4d-1df122372f8a-console-oauth-config\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.724380 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-config\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.726583 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.727447 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq5th\" (UniqueName: \"kubernetes.io/projected/754114a2-a012-43fe-923b-a8cc3df91aa0-kube-api-access-nq5th\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.735902 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.735946 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b8f571356fe2cb3489fcae1580a11d6ada33fdec8c8d1e0850e45e91197c9652/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.736012 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbz2v\" (UniqueName: \"kubernetes.io/projected/20201931-5a9c-4f86-ad4d-1df122372f8a-kube-api-access-mbz2v\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.747648 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/20201931-5a9c-4f86-ad4d-1df122372f8a-console-serving-cert\") pod \"console-7cc4f8f495-dxd8x\" (UID: \"20201931-5a9c-4f86-ad4d-1df122372f8a\") " pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.768289 4721 util.go:30] "No sandbox for pod can be found. 
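Need to start a new one" pod="openshift-console/console-7cc4f8f495-dxd8x"

The run of records above is kubelet's volume manager doing its usual three-step dance for every volume of the two freshly admitted pods: operationExecutor.VerifyControllerAttachedVolume (reconciler_common.go:245), then operationExecutor.MountVolume (reconciler_common.go:218), then "MountVolume.SetUp succeeded" (operation_generator.go:637) once the mount lands. The sketch below shows the general desired-state versus actual-state loop behind that progression; it is a toy model under my own assumptions, not kubelet code, and every name in it (volume, reconcile, mount) is invented for illustration.

    package main

    import "fmt"

    // volume is an invented stand-in for kubelet's volume-to-mount records.
    type volume struct{ name, pod string }

    // reconcile walks the desired state and mounts anything not yet present in
    // the actual state, mirroring the VerifyControllerAttachedVolume ->
    // MountVolume -> "MountVolume.SetUp succeeded" progression logged above.
    func reconcile(desired []volume, actual map[string]bool, mount func(volume) error) {
        for _, v := range desired {
            if actual[v.name] {
                continue // already mounted, nothing to do this tick
            }
            fmt.Printf("MountVolume started for volume %q pod %q\n", v.name, v.pod)
            if err := mount(v); err != nil {
                fmt.Printf("MountVolume failed for %q: %v\n", v.name, err)
                continue // stays in desired state; retried on the next tick
            }
            actual[v.name] = true
            fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v.name)
        }
    }

    func main() {
        desired := []volume{
            {"tls-assets", "prometheus-metric-storage-0"},
            {"console-config", "console-7cc4f8f495-dxd8x"},
        }
        reconcile(desired, map[string]bool{}, func(volume) error { return nil })
    }

Because the loop is level-triggered rather than edge-triggered, a failed mount simply stays in the desired set and is retried, which is why transient errors in logs like this one are usually followed by a later "succeeded" record for the same volume.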
Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.811993 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6064a9a4-2316-4bdd-abf1-934e9167528a-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-6lvhx\" (UID: \"6064a9a4-2316-4bdd-abf1-934e9167528a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-6lvhx"
Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.814396 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\") pod \"prometheus-metric-storage-0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.821030 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6064a9a4-2316-4bdd-abf1-934e9167528a-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-6lvhx\" (UID: \"6064a9a4-2316-4bdd-abf1-934e9167528a\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-6lvhx"
Feb 02 13:21:42 crc kubenswrapper[4721]: I0202 13:21:42.965695 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-6lvhx"
Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.110685 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.757030 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.759057 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.764759 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.764905 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.765108 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-5jqnn" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.765213 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.765379 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.789574 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.832263 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pzpz\" (UniqueName: \"kubernetes.io/projected/080bfc29-50bc-4ba1-b097-4f5c54586d8c-kube-api-access-5pzpz\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.832583 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/080bfc29-50bc-4ba1-b097-4f5c54586d8c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.832610 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/080bfc29-50bc-4ba1-b097-4f5c54586d8c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.832641 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/080bfc29-50bc-4ba1-b097-4f5c54586d8c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.832705 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/080bfc29-50bc-4ba1-b097-4f5c54586d8c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.832760 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/080bfc29-50bc-4ba1-b097-4f5c54586d8c-config\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.832793 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-f242d59e-e98d-4b10-969c-f41a7a663807\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f242d59e-e98d-4b10-969c-f41a7a663807\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.832848 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/080bfc29-50bc-4ba1-b097-4f5c54586d8c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.934552 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/080bfc29-50bc-4ba1-b097-4f5c54586d8c-config\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.935920 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f242d59e-e98d-4b10-969c-f41a7a663807\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f242d59e-e98d-4b10-969c-f41a7a663807\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.936209 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/080bfc29-50bc-4ba1-b097-4f5c54586d8c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.936466 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pzpz\" (UniqueName: \"kubernetes.io/projected/080bfc29-50bc-4ba1-b097-4f5c54586d8c-kube-api-access-5pzpz\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.936649 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/080bfc29-50bc-4ba1-b097-4f5c54586d8c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.936755 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/080bfc29-50bc-4ba1-b097-4f5c54586d8c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.936970 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/080bfc29-50bc-4ba1-b097-4f5c54586d8c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.935855 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/080bfc29-50bc-4ba1-b097-4f5c54586d8c-config\") pod \"ovsdbserver-nb-0\" 
(UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.938521 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/080bfc29-50bc-4ba1-b097-4f5c54586d8c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.939793 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.939846 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f242d59e-e98d-4b10-969c-f41a7a663807\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f242d59e-e98d-4b10-969c-f41a7a663807\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e57f4511cabc27666ca4cabeef41edec6a12ceaf849ba65a34eca7aaee98ae69/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.940245 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/080bfc29-50bc-4ba1-b097-4f5c54586d8c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.941757 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/080bfc29-50bc-4ba1-b097-4f5c54586d8c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.948235 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/080bfc29-50bc-4ba1-b097-4f5c54586d8c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.948850 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/080bfc29-50bc-4ba1-b097-4f5c54586d8c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.950445 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/080bfc29-50bc-4ba1-b097-4f5c54586d8c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.965246 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pzpz\" (UniqueName: \"kubernetes.io/projected/080bfc29-50bc-4ba1-b097-4f5c54586d8c-kube-api-access-5pzpz\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:43 crc kubenswrapper[4721]: I0202 13:21:43.986822 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-f242d59e-e98d-4b10-969c-f41a7a663807\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f242d59e-e98d-4b10-969c-f41a7a663807\") pod \"ovsdbserver-nb-0\" (UID: \"080bfc29-50bc-4ba1-b097-4f5c54586d8c\") " pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:44 crc kubenswrapper[4721]: I0202 13:21:44.100914 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 02 13:21:44 crc kubenswrapper[4721]: I0202 13:21:44.763463 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:21:44 crc kubenswrapper[4721]: I0202 13:21:44.763514 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.586468 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-l5h78"] Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.589406 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.594957 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.595030 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.595971 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-bg96d" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.607148 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l5h78"] Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.636012 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-gz9nz"] Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.638230 4721 util.go:30] "No sandbox for pod can be found. 
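Need to start a new one" pod="openstack/ovn-controller-ovs-gz9nz"

Among the records above is a failed HTTP liveness probe: the GET against http://127.0.0.1:8798/health for machine-config-daemon-rppjz is refused because the daemon is not listening yet, so prober.go records "Probe failed"; if failures keep exceeding the configured failure threshold, kubelet restarts the container. A minimal sketch of such a probe follows; probeHTTP is an invented helper, and the success criterion (status 200 through 399) matches how Kubernetes documents HTTP probe behaviour.

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probeHTTP mimics an HTTP liveness probe: any transport error or a status
    // outside 200-399 counts as a failure, like the "Probe failed" record above.
    func probeHTTP(url string, timeout time.Duration) error {
        client := &http.Client{Timeout: timeout}
        resp, err := client.Get(url)
        if err != nil {
            return err // e.g. "dial tcp 127.0.0.1:8798: connect: connection refused"
        }
        defer resp.Body.Close()
        if resp.StatusCode < 200 || resp.StatusCode >= 400 {
            return fmt.Errorf("unexpected status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        if err := probeHTTP("http://127.0.0.1:8798/health", time.Second); err != nil {
            fmt.Printf("Liveness probe status=failure output=%q\n", err)
        }
    }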
Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.651920 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gz9nz"]
Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.682931 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22rqp\" (UniqueName: \"kubernetes.io/projected/298ac2ef-6edb-40cb-bb92-8a8e039f333b-kube-api-access-22rqp\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78"
Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.682993 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/298ac2ef-6edb-40cb-bb92-8a8e039f333b-ovn-controller-tls-certs\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78"
Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.683116 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/298ac2ef-6edb-40cb-bb92-8a8e039f333b-combined-ca-bundle\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78"
Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.683143 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/298ac2ef-6edb-40cb-bb92-8a8e039f333b-scripts\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78"
Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.683177 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/298ac2ef-6edb-40cb-bb92-8a8e039f333b-var-run-ovn\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78"
Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.683250 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/298ac2ef-6edb-40cb-bb92-8a8e039f333b-var-log-ovn\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78"
Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.683270 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/298ac2ef-6edb-40cb-bb92-8a8e039f333b-var-run\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78"
Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.785089 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a75df612-e3f4-4ea3-bfc8-daceaf59205d-var-log\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz"
Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.785188 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/298ac2ef-6edb-40cb-bb92-8a8e039f333b-var-log-ovn\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.785214 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a75df612-e3f4-4ea3-bfc8-daceaf59205d-var-run\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.785236 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/298ac2ef-6edb-40cb-bb92-8a8e039f333b-var-run\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.785262 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c2xw\" (UniqueName: \"kubernetes.io/projected/a75df612-e3f4-4ea3-bfc8-daceaf59205d-kube-api-access-8c2xw\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.785292 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a75df612-e3f4-4ea3-bfc8-daceaf59205d-etc-ovs\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.785394 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22rqp\" (UniqueName: \"kubernetes.io/projected/298ac2ef-6edb-40cb-bb92-8a8e039f333b-kube-api-access-22rqp\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.785457 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a75df612-e3f4-4ea3-bfc8-daceaf59205d-scripts\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.785483 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/298ac2ef-6edb-40cb-bb92-8a8e039f333b-ovn-controller-tls-certs\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.785552 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a75df612-e3f4-4ea3-bfc8-daceaf59205d-var-lib\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.785585 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/298ac2ef-6edb-40cb-bb92-8a8e039f333b-combined-ca-bundle\") pod 
\"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.785605 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/298ac2ef-6edb-40cb-bb92-8a8e039f333b-scripts\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.785635 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/298ac2ef-6edb-40cb-bb92-8a8e039f333b-var-run-ovn\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.786211 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/298ac2ef-6edb-40cb-bb92-8a8e039f333b-var-run-ovn\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.786333 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/298ac2ef-6edb-40cb-bb92-8a8e039f333b-var-log-ovn\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.786438 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/298ac2ef-6edb-40cb-bb92-8a8e039f333b-var-run\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.791011 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/298ac2ef-6edb-40cb-bb92-8a8e039f333b-scripts\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.791469 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/298ac2ef-6edb-40cb-bb92-8a8e039f333b-combined-ca-bundle\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.794089 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/298ac2ef-6edb-40cb-bb92-8a8e039f333b-ovn-controller-tls-certs\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.806642 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22rqp\" (UniqueName: \"kubernetes.io/projected/298ac2ef-6edb-40cb-bb92-8a8e039f333b-kube-api-access-22rqp\") pod \"ovn-controller-l5h78\" (UID: \"298ac2ef-6edb-40cb-bb92-8a8e039f333b\") " pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.888016 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" 
(UniqueName: \"kubernetes.io/host-path/a75df612-e3f4-4ea3-bfc8-daceaf59205d-var-lib\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.888437 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a75df612-e3f4-4ea3-bfc8-daceaf59205d-var-lib\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.888824 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a75df612-e3f4-4ea3-bfc8-daceaf59205d-var-log\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.888956 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a75df612-e3f4-4ea3-bfc8-daceaf59205d-var-run\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.888993 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c2xw\" (UniqueName: \"kubernetes.io/projected/a75df612-e3f4-4ea3-bfc8-daceaf59205d-kube-api-access-8c2xw\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.889032 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a75df612-e3f4-4ea3-bfc8-daceaf59205d-etc-ovs\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.889114 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a75df612-e3f4-4ea3-bfc8-daceaf59205d-scripts\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.889884 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a75df612-e3f4-4ea3-bfc8-daceaf59205d-var-run\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.891033 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a75df612-e3f4-4ea3-bfc8-daceaf59205d-var-log\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.891273 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a75df612-e3f4-4ea3-bfc8-daceaf59205d-etc-ovs\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: 
I0202 13:21:45.891517 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a75df612-e3f4-4ea3-bfc8-daceaf59205d-scripts\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.906308 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c2xw\" (UniqueName: \"kubernetes.io/projected/a75df612-e3f4-4ea3-bfc8-daceaf59205d-kube-api-access-8c2xw\") pod \"ovn-controller-ovs-gz9nz\" (UID: \"a75df612-e3f4-4ea3-bfc8-daceaf59205d\") " pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.938368 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l5h78" Feb 02 13:21:45 crc kubenswrapper[4721]: I0202 13:21:45.967606 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.480107 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.483126 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.485082 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.486717 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.486858 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.486967 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-t4kxj" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.500722 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.557412 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e175d27-fe10-4fb7-9ce6-cb98379357cc-config\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.557638 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e175d27-fe10-4fb7-9ce6-cb98379357cc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.557717 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e175d27-fe10-4fb7-9ce6-cb98379357cc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.557891 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"pvc-7fd03a9a-65ca-4f76-8f7b-ac794e369792\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7fd03a9a-65ca-4f76-8f7b-ac794e369792\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.558034 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e175d27-fe10-4fb7-9ce6-cb98379357cc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.558172 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4e175d27-fe10-4fb7-9ce6-cb98379357cc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.558207 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snjz9\" (UniqueName: \"kubernetes.io/projected/4e175d27-fe10-4fb7-9ce6-cb98379357cc-kube-api-access-snjz9\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.558284 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e175d27-fe10-4fb7-9ce6-cb98379357cc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.659705 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e175d27-fe10-4fb7-9ce6-cb98379357cc-config\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.659801 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e175d27-fe10-4fb7-9ce6-cb98379357cc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.659827 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e175d27-fe10-4fb7-9ce6-cb98379357cc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.659876 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7fd03a9a-65ca-4f76-8f7b-ac794e369792\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7fd03a9a-65ca-4f76-8f7b-ac794e369792\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.659933 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4e175d27-fe10-4fb7-9ce6-cb98379357cc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.659986 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4e175d27-fe10-4fb7-9ce6-cb98379357cc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.660709 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e175d27-fe10-4fb7-9ce6-cb98379357cc-config\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.660855 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snjz9\" (UniqueName: \"kubernetes.io/projected/4e175d27-fe10-4fb7-9ce6-cb98379357cc-kube-api-access-snjz9\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.660904 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e175d27-fe10-4fb7-9ce6-cb98379357cc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.661354 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4e175d27-fe10-4fb7-9ce6-cb98379357cc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.662705 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e175d27-fe10-4fb7-9ce6-cb98379357cc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.665559 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e175d27-fe10-4fb7-9ce6-cb98379357cc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.668410 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e175d27-fe10-4fb7-9ce6-cb98379357cc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.669716 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e175d27-fe10-4fb7-9ce6-cb98379357cc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0" Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.691980 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
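STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...

The csi_attacher.go record completing just above is the CSI staging gate: in the CSI specification, NodeStageVolume/NodeUnstageVolume are optional and are only invoked when the plugin advertises the STAGE_UNSTAGE_VOLUME node capability. kubevirt.io.hostpath-provisioner does not advertise it, so kubelet records MountDevice as an immediate success and proceeds straight to the per-pod SetUp (NodePublishVolume). A sketch of that gate, with invented names (driver, mountDevice) and an elided global-mount hash in the example path:

    package main

    import "fmt"

    // driver is an invented stand-in for a CSI plugin's advertised capabilities.
    type driver struct {
        name            string
        hasStageUnstage bool // true when the plugin advertises STAGE_UNSTAGE_VOLUME
    }

    // mountDevice sketches kubelet's gate: NodeStageVolume is optional in the
    // CSI spec, so without the capability the device-stage step is a logged no-op.
    func mountDevice(d driver, volumeID, globalMountPath string) {
        if !d.hasStageUnstage {
            fmt.Printf("%s: STAGE_UNSTAGE_VOLUME not set, skipping NodeStageVolume for %s\n", d.name, volumeID)
            return
        }
        // With the capability set, the volume is staged once at the global mount
        // path and each pod's SetUp bind-mounts from there.
        fmt.Printf("%s: NodeStageVolume(%s) -> %s\n", d.name, volumeID, globalMountPath)
    }

    func main() {
        hpp := driver{name: "kubevirt.io.hostpath-provisioner", hasStageUnstage: false}
        mountDevice(hpp, "pvc-7fd03a9a-65ca-4f76-8f7b-ac794e369792",
            "/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/.../globalmount")
    }

Staging exists so that block devices needing an expensive attach/format step are mounted once per node; a hostpath-style provisioner has nothing to stage, which is why skipping it here is expected rather than an error.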
Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.692028 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7fd03a9a-65ca-4f76-8f7b-ac794e369792\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7fd03a9a-65ca-4f76-8f7b-ac794e369792\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0a56f1e81213f20d53ad52bdb2f1d153ff7361cb2b5b7b048fab2231a40ceec0/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.732197 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snjz9\" (UniqueName: \"kubernetes.io/projected/4e175d27-fe10-4fb7-9ce6-cb98379357cc-kube-api-access-snjz9\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0"
Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.798925 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7fd03a9a-65ca-4f76-8f7b-ac794e369792\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7fd03a9a-65ca-4f76-8f7b-ac794e369792\") pod \"ovsdbserver-sb-0\" (UID: \"4e175d27-fe10-4fb7-9ce6-cb98379357cc\") " pod="openstack/ovsdbserver-sb-0"
Feb 02 13:21:48 crc kubenswrapper[4721]: I0202 13:21:48.830529 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 02 13:21:51 crc kubenswrapper[4721]: I0202 13:21:51.767227 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"]
Feb 02 13:21:52 crc kubenswrapper[4721]: E0202 13:21:52.404417 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Feb 02 13:21:52 crc kubenswrapper[4721]: E0202 13:21:52.405398 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzdpg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-ppnbl_openstack(ded37b19-4830-4437-9a53-778f826f3582): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:21:52 crc kubenswrapper[4721]: E0202 13:21:52.407465 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" podUID="ded37b19-4830-4437-9a53-778f826f3582" Feb 02 13:21:52 crc kubenswrapper[4721]: W0202 13:21:52.429383 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d21d961_1540_4610_89c0_ee265f66d728.slice/crio-36afb1dca2e8a69b46d5d0b644f99e96709f045f97e67bf1482c430f4244c578 WatchSource:0}: Error finding container 36afb1dca2e8a69b46d5d0b644f99e96709f045f97e67bf1482c430f4244c578: Status 404 returned error can't find the container with id 36afb1dca2e8a69b46d5d0b644f99e96709f045f97e67bf1482c430f4244c578 Feb 02 13:21:52 crc kubenswrapper[4721]: E0202 13:21:52.531475 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 02 13:21:52 crc kubenswrapper[4721]: E0202 13:21:52.531703 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv 
--bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6pk9z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-dx98r_openstack(b256785d-0ae0-454d-8927-a28668507e06): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:21:52 crc kubenswrapper[4721]: E0202 13:21:52.534977 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-dx98r" podUID="b256785d-0ae0-454d-8927-a28668507e06" Feb 02 13:21:52 crc kubenswrapper[4721]: I0202 13:21:52.808930 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"4d21d961-1540-4610-89c0-ee265f66d728","Type":"ContainerStarted","Data":"36afb1dca2e8a69b46d5d0b644f99e96709f045f97e67bf1482c430f4244c578"} Feb 02 13:21:53 crc kubenswrapper[4721]: I0202 13:21:53.415697 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 13:21:53 crc kubenswrapper[4721]: W0202 13:21:53.424494 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda57cea33_806c_4028_b59f_9f5e65289eac.slice/crio-426dd2c7f5f6aedf336052b0eb98f8b62ecc6c202c3b2a5958bd85ca968a7a63 WatchSource:0}: Error finding container 426dd2c7f5f6aedf336052b0eb98f8b62ecc6c202c3b2a5958bd85ca968a7a63: Status 404 returned error can't find the container with id 426dd2c7f5f6aedf336052b0eb98f8b62ecc6c202c3b2a5958bd85ca968a7a63 Feb 02 13:21:53 crc kubenswrapper[4721]: I0202 13:21:53.846644 4721 generic.go:334] "Generic (PLEG): container finished" podID="242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff" containerID="3dbaa0b96c9ac5ea0094bc68c273d9176a5f00da8a9565f16566f8f212818281" exitCode=0 Feb 02 13:21:53 crc kubenswrapper[4721]: I0202 13:21:53.846985 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" 
event={"ID":"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff","Type":"ContainerDied","Data":"3dbaa0b96c9ac5ea0094bc68c273d9176a5f00da8a9565f16566f8f212818281"} Feb 02 13:21:53 crc kubenswrapper[4721]: I0202 13:21:53.860037 4721 generic.go:334] "Generic (PLEG): container finished" podID="f9598cea-e831-47fd-aa1a-08060e23bba2" containerID="a20ee72f3c2584aa99632a64dbf1cb3b2b0b3ad83db068faf7bb964a8d3d6314" exitCode=0 Feb 02 13:21:53 crc kubenswrapper[4721]: I0202 13:21:53.860193 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" event={"ID":"f9598cea-e831-47fd-aa1a-08060e23bba2","Type":"ContainerDied","Data":"a20ee72f3c2584aa99632a64dbf1cb3b2b0b3ad83db068faf7bb964a8d3d6314"} Feb 02 13:21:53 crc kubenswrapper[4721]: I0202 13:21:53.865889 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a57cea33-806c-4028-b59f-9f5e65289eac","Type":"ContainerStarted","Data":"426dd2c7f5f6aedf336052b0eb98f8b62ecc6c202c3b2a5958bd85ca968a7a63"} Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.128773 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dx98r" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.202557 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.215004 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.230394 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.250606 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.254131 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b256785d-0ae0-454d-8927-a28668507e06-config\") pod \"b256785d-0ae0-454d-8927-a28668507e06\" (UID: \"b256785d-0ae0-454d-8927-a28668507e06\") " Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.254817 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b256785d-0ae0-454d-8927-a28668507e06-config" (OuterVolumeSpecName: "config") pod "b256785d-0ae0-454d-8927-a28668507e06" (UID: "b256785d-0ae0-454d-8927-a28668507e06"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.255028 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pk9z\" (UniqueName: \"kubernetes.io/projected/b256785d-0ae0-454d-8927-a28668507e06-kube-api-access-6pk9z\") pod \"b256785d-0ae0-454d-8927-a28668507e06\" (UID: \"b256785d-0ae0-454d-8927-a28668507e06\") " Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.257413 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b256785d-0ae0-454d-8927-a28668507e06-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.259234 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.262846 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b256785d-0ae0-454d-8927-a28668507e06-kube-api-access-6pk9z" (OuterVolumeSpecName: "kube-api-access-6pk9z") pod "b256785d-0ae0-454d-8927-a28668507e06" (UID: "b256785d-0ae0-454d-8927-a28668507e06"). InnerVolumeSpecName "kube-api-access-6pk9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:21:54 crc kubenswrapper[4721]: W0202 13:21:54.262777 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod754114a2_a012_43fe_923b_a8cc3df91aa0.slice/crio-1ae72f1d8842f29156e47082662e67e943468c1763e7bc830d0036db2470a455 WatchSource:0}: Error finding container 1ae72f1d8842f29156e47082662e67e943468c1763e7bc830d0036db2470a455: Status 404 returned error can't find the container with id 1ae72f1d8842f29156e47082662e67e943468c1763e7bc830d0036db2470a455 Feb 02 13:21:54 crc kubenswrapper[4721]: W0202 13:21:54.278596 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6dbd607_3fa8_48e0_b420_4e939a47c460.slice/crio-ce0a1e8e0cd036f62ef8c895b722638e8b437bbacf86893de06773f534d64f9d WatchSource:0}: Error finding container ce0a1e8e0cd036f62ef8c895b722638e8b437bbacf86893de06773f534d64f9d: Status 404 returned error can't find the container with id ce0a1e8e0cd036f62ef8c895b722638e8b437bbacf86893de06773f534d64f9d Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.279504 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cc4f8f495-dxd8x"] Feb 02 13:21:54 crc kubenswrapper[4721]: W0202 13:21:54.286903 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda686ac60_f231_4070_98c7_7acbc66c29d5.slice/crio-944bebc705f70fcbcfaf0716e490e26611866c14af425ec38457dd857b00bcb4 WatchSource:0}: Error finding container 944bebc705f70fcbcfaf0716e490e26611866c14af425ec38457dd857b00bcb4: Status 404 returned error can't find the container with id 944bebc705f70fcbcfaf0716e490e26611866c14af425ec38457dd857b00bcb4 Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.288510 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 13:21:54 crc kubenswrapper[4721]: W0202 13:21:54.295415 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20201931_5a9c_4f86_ad4d_1df122372f8a.slice/crio-48a469129cff175d8479cfd49feb85c3e4574b8814b0a845b5cd5bc1d5df1849 
WatchSource:0}: Error finding container 48a469129cff175d8479cfd49feb85c3e4574b8814b0a845b5cd5bc1d5df1849: Status 404 returned error can't find the container with id 48a469129cff175d8479cfd49feb85c3e4574b8814b0a845b5cd5bc1d5df1849 Feb 02 13:21:54 crc kubenswrapper[4721]: W0202 13:21:54.298512 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod496bb19e_217b_4896_9bee_8082ac5da28b.slice/crio-8df7987520f710454284227fffc5726ebb458e3dc37e41c9c6583e8675accf28 WatchSource:0}: Error finding container 8df7987520f710454284227fffc5726ebb458e3dc37e41c9c6583e8675accf28: Status 404 returned error can't find the container with id 8df7987520f710454284227fffc5726ebb458e3dc37e41c9c6583e8675accf28 Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.360072 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ded37b19-4830-4437-9a53-778f826f3582-dns-svc\") pod \"ded37b19-4830-4437-9a53-778f826f3582\" (UID: \"ded37b19-4830-4437-9a53-778f826f3582\") " Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.360311 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzdpg\" (UniqueName: \"kubernetes.io/projected/ded37b19-4830-4437-9a53-778f826f3582-kube-api-access-hzdpg\") pod \"ded37b19-4830-4437-9a53-778f826f3582\" (UID: \"ded37b19-4830-4437-9a53-778f826f3582\") " Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.360579 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ded37b19-4830-4437-9a53-778f826f3582-config\") pod \"ded37b19-4830-4437-9a53-778f826f3582\" (UID: \"ded37b19-4830-4437-9a53-778f826f3582\") " Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.360986 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ded37b19-4830-4437-9a53-778f826f3582-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ded37b19-4830-4437-9a53-778f826f3582" (UID: "ded37b19-4830-4437-9a53-778f826f3582"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.361160 4721 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ded37b19-4830-4437-9a53-778f826f3582-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.361181 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pk9z\" (UniqueName: \"kubernetes.io/projected/b256785d-0ae0-454d-8927-a28668507e06-kube-api-access-6pk9z\") on node \"crc\" DevicePath \"\"" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.361949 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ded37b19-4830-4437-9a53-778f826f3582-config" (OuterVolumeSpecName: "config") pod "ded37b19-4830-4437-9a53-778f826f3582" (UID: "ded37b19-4830-4437-9a53-778f826f3582"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.366052 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded37b19-4830-4437-9a53-778f826f3582-kube-api-access-hzdpg" (OuterVolumeSpecName: "kube-api-access-hzdpg") pod "ded37b19-4830-4437-9a53-778f826f3582" (UID: "ded37b19-4830-4437-9a53-778f826f3582"). 
InnerVolumeSpecName "kube-api-access-hzdpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.463366 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ded37b19-4830-4437-9a53-778f826f3582-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.463403 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzdpg\" (UniqueName: \"kubernetes.io/projected/ded37b19-4830-4437-9a53-778f826f3582-kube-api-access-hzdpg\") on node \"crc\" DevicePath \"\"" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.720970 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.796001 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.805238 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-6lvhx"] Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.817070 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l5h78"] Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.838154 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.881218 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"754114a2-a012-43fe-923b-a8cc3df91aa0","Type":"ContainerStarted","Data":"1ae72f1d8842f29156e47082662e67e943468c1763e7bc830d0036db2470a455"} Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.886163 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" event={"ID":"ded37b19-4830-4437-9a53-778f826f3582","Type":"ContainerDied","Data":"e4ecfe7e04cb7aa8d37b9511552603f7febbe5a304c908615695a3cf3f4a961f"} Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.886236 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ppnbl" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.891795 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" event={"ID":"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff","Type":"ContainerStarted","Data":"e39b433adf3c414a77139ad47d46e6db4a99ca06421ae738ff6dda51a5079476"} Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.891868 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.893054 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.894287 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cc4f8f495-dxd8x" event={"ID":"20201931-5a9c-4f86-ad4d-1df122372f8a","Type":"ContainerStarted","Data":"ee31b8ae295838a63c0cbb3c51e04ccd55e0d107e0b3898e2c0f6e81f4671670"} Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.894324 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cc4f8f495-dxd8x" event={"ID":"20201931-5a9c-4f86-ad4d-1df122372f8a","Type":"ContainerStarted","Data":"48a469129cff175d8479cfd49feb85c3e4574b8814b0a845b5cd5bc1d5df1849"} Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.896149 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2505bd6b-64d4-4d17-9c1a-0e89562612be","Type":"ContainerStarted","Data":"59039b072cf0feef66db7d58bb9d13813151414ad72df9f948c8abd862ab710b"} Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.898785 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"b6dbd607-3fa8-48e0-b420-4e939a47c460","Type":"ContainerStarted","Data":"ce0a1e8e0cd036f62ef8c895b722638e8b437bbacf86893de06773f534d64f9d"} Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.901046 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"496bb19e-217b-4896-9bee-8082ac5da28b","Type":"ContainerStarted","Data":"8df7987520f710454284227fffc5726ebb458e3dc37e41c9c6583e8675accf28"} Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.902628 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-dx98r" event={"ID":"b256785d-0ae0-454d-8927-a28668507e06","Type":"ContainerDied","Data":"ffe8ae8e550e5fe561d651b70087eeacc3e6b311c6a4ad871733c9d878db06b3"} Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.902671 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-dx98r" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.907036 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a686ac60-f231-4070-98c7-7acbc66c29d5","Type":"ContainerStarted","Data":"944bebc705f70fcbcfaf0716e490e26611866c14af425ec38457dd857b00bcb4"} Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.910317 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" event={"ID":"f9598cea-e831-47fd-aa1a-08060e23bba2","Type":"ContainerStarted","Data":"79ee8e8028b6aa26c38e11ef05e770e5dd8bbf55490a89dcd0603734b8ebe97e"} Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.910479 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.910410 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" podStartSLOduration=3.61007286 podStartE2EDuration="20.910397174s" podCreationTimestamp="2026-02-02 13:21:34 +0000 UTC" firstStartedPulling="2026-02-02 13:21:35.326731728 +0000 UTC m=+1235.629246107" lastFinishedPulling="2026-02-02 13:21:52.627056042 +0000 UTC m=+1252.929570421" observedRunningTime="2026-02-02 13:21:54.90617775 +0000 UTC m=+1255.208692139" watchObservedRunningTime="2026-02-02 13:21:54.910397174 +0000 UTC m=+1255.212911583" Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.945723 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ppnbl"] Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.954856 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ppnbl"] Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.979111 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dx98r"] Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.992679 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-dx98r"] Feb 02 13:21:54 crc kubenswrapper[4721]: I0202 13:21:54.996965 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cc4f8f495-dxd8x" podStartSLOduration=12.996943364 podStartE2EDuration="12.996943364s" podCreationTimestamp="2026-02-02 13:21:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:21:54.983405948 +0000 UTC m=+1255.285920337" watchObservedRunningTime="2026-02-02 13:21:54.996943364 +0000 UTC m=+1255.299457763" Feb 02 13:21:55 crc kubenswrapper[4721]: I0202 13:21:55.016658 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" podStartSLOduration=8.476555862 podStartE2EDuration="21.016637517s" podCreationTimestamp="2026-02-02 13:21:34 +0000 UTC" firstStartedPulling="2026-02-02 13:21:40.090809531 +0000 UTC m=+1240.393323930" lastFinishedPulling="2026-02-02 13:21:52.630891196 +0000 UTC m=+1252.933405585" observedRunningTime="2026-02-02 13:21:55.005531777 +0000 UTC m=+1255.308046166" watchObservedRunningTime="2026-02-02 13:21:55.016637517 +0000 UTC m=+1255.319151906" Feb 02 13:21:55 crc kubenswrapper[4721]: I0202 13:21:55.063216 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gz9nz"] Feb 02 13:21:56 crc kubenswrapper[4721]: I0202 
13:21:56.425353 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b256785d-0ae0-454d-8927-a28668507e06" path="/var/lib/kubelet/pods/b256785d-0ae0-454d-8927-a28668507e06/volumes" Feb 02 13:21:56 crc kubenswrapper[4721]: I0202 13:21:56.426147 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ded37b19-4830-4437-9a53-778f826f3582" path="/var/lib/kubelet/pods/ded37b19-4830-4437-9a53-778f826f3582/volumes" Feb 02 13:21:56 crc kubenswrapper[4721]: W0202 13:21:56.872309 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e175d27_fe10_4fb7_9ce6_cb98379357cc.slice/crio-d00d9be33b48072beaaaab4076feea10fe8f0fdbb463271458ca986ec5edd095 WatchSource:0}: Error finding container d00d9be33b48072beaaaab4076feea10fe8f0fdbb463271458ca986ec5edd095: Status 404 returned error can't find the container with id d00d9be33b48072beaaaab4076feea10fe8f0fdbb463271458ca986ec5edd095 Feb 02 13:21:56 crc kubenswrapper[4721]: I0202 13:21:56.931655 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l5h78" event={"ID":"298ac2ef-6edb-40cb-bb92-8a8e039f333b","Type":"ContainerStarted","Data":"13e13d77bd07f43bdb5f2bddb70e54f67e8098f6d29d70c860b3d45acca04514"} Feb 02 13:21:56 crc kubenswrapper[4721]: I0202 13:21:56.934041 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gz9nz" event={"ID":"a75df612-e3f4-4ea3-bfc8-daceaf59205d","Type":"ContainerStarted","Data":"d326d84b5a85f3af2e8c53c3289b853de962bf85378968674fdf9ea42a704488"} Feb 02 13:21:56 crc kubenswrapper[4721]: I0202 13:21:56.935606 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"080bfc29-50bc-4ba1-b097-4f5c54586d8c","Type":"ContainerStarted","Data":"01ede81af8b0a6f17de5660f45222af06e54bd3e55bcadb3c130698b8e10a8b0"} Feb 02 13:21:56 crc kubenswrapper[4721]: I0202 13:21:56.936782 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-6lvhx" event={"ID":"6064a9a4-2316-4bdd-abf1-934e9167528a","Type":"ContainerStarted","Data":"2843497cad5bf287b6c581a67c84e3e6decf3e64aad739945932e5a281ee37ef"} Feb 02 13:21:56 crc kubenswrapper[4721]: I0202 13:21:56.938399 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4","Type":"ContainerStarted","Data":"0d714d60036ede65d525b7a71c6621f278f9400401ce8e0f30061a5e3a3f3ac2"} Feb 02 13:21:56 crc kubenswrapper[4721]: I0202 13:21:56.939603 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cc071000-a602-4de6-a9bc-1c93b6d58c25","Type":"ContainerStarted","Data":"f06dd476cbb6c5a0d98d77cc9568acefbefff19f14d266eab26513942d1c3774"} Feb 02 13:21:56 crc kubenswrapper[4721]: I0202 13:21:56.940633 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4e175d27-fe10-4fb7-9ce6-cb98379357cc","Type":"ContainerStarted","Data":"d00d9be33b48072beaaaab4076feea10fe8f0fdbb463271458ca986ec5edd095"} Feb 02 13:21:59 crc kubenswrapper[4721]: I0202 13:21:59.671012 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" Feb 02 13:21:59 crc kubenswrapper[4721]: I0202 13:21:59.995280 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"496bb19e-217b-4896-9bee-8082ac5da28b","Type":"ContainerStarted","Data":"77992fe4aaf2cf252e3f8b5179aa81c542c6cc143ebeb5bc6250cbc35937ab79"} Feb 02 13:22:00 crc kubenswrapper[4721]: I0202 13:22:00.002115 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"b6dbd607-3fa8-48e0-b420-4e939a47c460","Type":"ContainerStarted","Data":"ee3f7790a31c71e90ec0266b207ae1db8fc80457afd04cb12988c018efe7f723"} Feb 02 13:22:00 crc kubenswrapper[4721]: I0202 13:22:00.006373 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a57cea33-806c-4028-b59f-9f5e65289eac","Type":"ContainerStarted","Data":"c824f81205ecd08e0b369ac8397beb509a4ac36dc83f526d99dfec02aa78a3a3"} Feb 02 13:22:00 crc kubenswrapper[4721]: I0202 13:22:00.010229 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"4d21d961-1540-4610-89c0-ee265f66d728","Type":"ContainerStarted","Data":"562fa5c00d88c0f5830431f80588f0c092e7efe8f3354457564b72a3bf152ac5"} Feb 02 13:22:00 crc kubenswrapper[4721]: I0202 13:22:00.150182 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" Feb 02 13:22:00 crc kubenswrapper[4721]: I0202 13:22:00.246592 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-h9xj9"] Feb 02 13:22:00 crc kubenswrapper[4721]: I0202 13:22:00.246791 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" podUID="242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff" containerName="dnsmasq-dns" containerID="cri-o://e39b433adf3c414a77139ad47d46e6db4a99ca06421ae738ff6dda51a5079476" gracePeriod=10 Feb 02 13:22:01 crc kubenswrapper[4721]: I0202 13:22:01.021284 4721 generic.go:334] "Generic (PLEG): container finished" podID="242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff" containerID="e39b433adf3c414a77139ad47d46e6db4a99ca06421ae738ff6dda51a5079476" exitCode=0 Feb 02 13:22:01 crc kubenswrapper[4721]: I0202 13:22:01.021419 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" event={"ID":"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff","Type":"ContainerDied","Data":"e39b433adf3c414a77139ad47d46e6db4a99ca06421ae738ff6dda51a5079476"} Feb 02 13:22:02 crc kubenswrapper[4721]: I0202 13:22:02.768461 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:22:02 crc kubenswrapper[4721]: I0202 13:22:02.768748 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:22:02 crc kubenswrapper[4721]: I0202 13:22:02.773321 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:22:03 crc kubenswrapper[4721]: I0202 13:22:03.042170 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cc4f8f495-dxd8x" Feb 02 13:22:03 crc kubenswrapper[4721]: I0202 13:22:03.109748 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-679d56c757-8hcnt"] Feb 02 13:22:04 crc kubenswrapper[4721]: I0202 13:22:04.670542 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" podUID="242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: 
connection refused" Feb 02 13:22:07 crc kubenswrapper[4721]: I0202 13:22:07.441555 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" Feb 02 13:22:07 crc kubenswrapper[4721]: I0202 13:22:07.606175 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp8wm\" (UniqueName: \"kubernetes.io/projected/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-kube-api-access-gp8wm\") pod \"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff\" (UID: \"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff\") " Feb 02 13:22:07 crc kubenswrapper[4721]: I0202 13:22:07.606268 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-dns-svc\") pod \"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff\" (UID: \"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff\") " Feb 02 13:22:07 crc kubenswrapper[4721]: I0202 13:22:07.606406 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-config\") pod \"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff\" (UID: \"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff\") " Feb 02 13:22:07 crc kubenswrapper[4721]: I0202 13:22:07.609449 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-kube-api-access-gp8wm" (OuterVolumeSpecName: "kube-api-access-gp8wm") pod "242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff" (UID: "242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff"). InnerVolumeSpecName "kube-api-access-gp8wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:07 crc kubenswrapper[4721]: I0202 13:22:07.658418 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff" (UID: "242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:07 crc kubenswrapper[4721]: I0202 13:22:07.658449 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-config" (OuterVolumeSpecName: "config") pod "242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff" (UID: "242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:07 crc kubenswrapper[4721]: I0202 13:22:07.711937 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp8wm\" (UniqueName: \"kubernetes.io/projected/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-kube-api-access-gp8wm\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:07 crc kubenswrapper[4721]: I0202 13:22:07.711969 4721 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:07 crc kubenswrapper[4721]: I0202 13:22:07.711979 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.095659 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" event={"ID":"242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff","Type":"ContainerDied","Data":"f45d2f8df0c32334c973de797a13b02d6fc09764b2e977590919ddc80c08dc8f"} Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.096024 4721 scope.go:117] "RemoveContainer" containerID="e39b433adf3c414a77139ad47d46e6db4a99ca06421ae738ff6dda51a5079476" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.095711 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-h9xj9" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.140801 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-h9xj9"] Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.151060 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-h9xj9"] Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.316553 4721 scope.go:117] "RemoveContainer" containerID="3dbaa0b96c9ac5ea0094bc68c273d9176a5f00da8a9565f16566f8f212818281" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.426284 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff" path="/var/lib/kubelet/pods/242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff/volumes" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.864502 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-hkwkv"] Feb 02 13:22:08 crc kubenswrapper[4721]: E0202 13:22:08.865278 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff" containerName="init" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.865294 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff" containerName="init" Feb 02 13:22:08 crc kubenswrapper[4721]: E0202 13:22:08.865336 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff" containerName="dnsmasq-dns" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.865343 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff" containerName="dnsmasq-dns" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.865560 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="242f2c9f-2150-4d1a-8c40-9e34f1ffc5ff" containerName="dnsmasq-dns" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.866769 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.869161 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.896473 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hkwkv"] Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.947110 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753a63ae-970e-4dd1-a284-bc3b6027ca64-combined-ca-bundle\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.947177 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/753a63ae-970e-4dd1-a284-bc3b6027ca64-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.947212 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/753a63ae-970e-4dd1-a284-bc3b6027ca64-ovs-rundir\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.947234 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/753a63ae-970e-4dd1-a284-bc3b6027ca64-ovn-rundir\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.947268 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tdlw\" (UniqueName: \"kubernetes.io/projected/753a63ae-970e-4dd1-a284-bc3b6027ca64-kube-api-access-4tdlw\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:08 crc kubenswrapper[4721]: I0202 13:22:08.947391 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753a63ae-970e-4dd1-a284-bc3b6027ca64-config\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.049954 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753a63ae-970e-4dd1-a284-bc3b6027ca64-combined-ca-bundle\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.050349 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/753a63ae-970e-4dd1-a284-bc3b6027ca64-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.050393 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/753a63ae-970e-4dd1-a284-bc3b6027ca64-ovs-rundir\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.050419 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/753a63ae-970e-4dd1-a284-bc3b6027ca64-ovn-rundir\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.050465 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tdlw\" (UniqueName: \"kubernetes.io/projected/753a63ae-970e-4dd1-a284-bc3b6027ca64-kube-api-access-4tdlw\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.050550 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753a63ae-970e-4dd1-a284-bc3b6027ca64-config\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.050698 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/753a63ae-970e-4dd1-a284-bc3b6027ca64-ovs-rundir\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.050779 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/753a63ae-970e-4dd1-a284-bc3b6027ca64-ovn-rundir\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.051420 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753a63ae-970e-4dd1-a284-bc3b6027ca64-config\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.056161 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753a63ae-970e-4dd1-a284-bc3b6027ca64-combined-ca-bundle\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.056720 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/753a63ae-970e-4dd1-a284-bc3b6027ca64-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " 
pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.071206 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tdlw\" (UniqueName: \"kubernetes.io/projected/753a63ae-970e-4dd1-a284-bc3b6027ca64-kube-api-access-4tdlw\") pod \"ovn-controller-metrics-hkwkv\" (UID: \"753a63ae-970e-4dd1-a284-bc3b6027ca64\") " pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.101596 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-8dm9r"] Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.103373 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.105989 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.109158 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gz9nz" event={"ID":"a75df612-e3f4-4ea3-bfc8-daceaf59205d","Type":"ContainerStarted","Data":"14cb944d9ebd6f321f4e7337871523268072f36cf383a9853996146747d08a5b"} Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.112625 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-6lvhx" event={"ID":"6064a9a4-2316-4bdd-abf1-934e9167528a","Type":"ContainerStarted","Data":"db48bb505d8316d1ad58e3c2189e606c5e7d3f9c21ce8628f76ea85096ff65d4"} Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.113886 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4","Type":"ContainerStarted","Data":"3f181baca9c046b0491dc6e8352b3a4da843071fc3c26260b6a24a6eceb14a93"} Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.115177 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4e175d27-fe10-4fb7-9ce6-cb98379357cc","Type":"ContainerStarted","Data":"320908b034f5fe19814c7c0745afd4b6011c3f3fc46fa9e30c994d6b7cc2fbf0"} Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.117456 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-8dm9r"] Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.152004 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l5h78" event={"ID":"298ac2ef-6edb-40cb-bb92-8a8e039f333b","Type":"ContainerStarted","Data":"363dd78e1c4e5867cf298bb86c64e218deba7be60b4ca20bfbf5998146e7116d"} Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.153779 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-l5h78" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.161654 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"080bfc29-50bc-4ba1-b097-4f5c54586d8c","Type":"ContainerStarted","Data":"19b4027e8736b74bf49b5b89d775dabe3e8a79fa82676e1e337d9a187eb4727b"} Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.178382 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a686ac60-f231-4070-98c7-7acbc66c29d5","Type":"ContainerStarted","Data":"bb793056172a633150843ba507e9aeef9f6e37d73e63bc5b790e41225aa4edfb"} Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.179325 4721 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/memcached-0" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.189379 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hkwkv" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.190636 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2505bd6b-64d4-4d17-9c1a-0e89562612be","Type":"ContainerStarted","Data":"d12641d1726ea2160227d17b476c80e4ad8727c8ef150b1b1b2de53e6dfe063d"} Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.196311 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cc071000-a602-4de6-a9bc-1c93b6d58c25","Type":"ContainerStarted","Data":"28479592c480267adca57389ec0895d73f9fc0cf7e8ea4b979c6f7d9640013a3"} Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.197166 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.220710 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-6lvhx" podStartSLOduration=16.627084138 podStartE2EDuration="27.220683109s" podCreationTimestamp="2026-02-02 13:21:42 +0000 UTC" firstStartedPulling="2026-02-02 13:21:56.893462668 +0000 UTC m=+1257.195977057" lastFinishedPulling="2026-02-02 13:22:07.487061639 +0000 UTC m=+1267.789576028" observedRunningTime="2026-02-02 13:22:09.214845761 +0000 UTC m=+1269.517360150" watchObservedRunningTime="2026-02-02 13:22:09.220683109 +0000 UTC m=+1269.523197518" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.256880 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-8dm9r\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.257051 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-8dm9r\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.257125 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6mw6\" (UniqueName: \"kubernetes.io/projected/ab22a3c1-704b-4bc7-81cb-623e429e619a-kube-api-access-c6mw6\") pod \"dnsmasq-dns-7fd796d7df-8dm9r\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.257193 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-config\") pod \"dnsmasq-dns-7fd796d7df-8dm9r\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.258419 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.668202495 podStartE2EDuration="28.258398219s" 
podCreationTimestamp="2026-02-02 13:21:41 +0000 UTC" firstStartedPulling="2026-02-02 13:21:56.890630082 +0000 UTC m=+1257.193144471" lastFinishedPulling="2026-02-02 13:22:08.480825806 +0000 UTC m=+1268.783340195" observedRunningTime="2026-02-02 13:22:09.243192848 +0000 UTC m=+1269.545707237" watchObservedRunningTime="2026-02-02 13:22:09.258398219 +0000 UTC m=+1269.560912608" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.318859 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-l5h78" podStartSLOduration=13.253537779 podStartE2EDuration="24.318834672s" podCreationTimestamp="2026-02-02 13:21:45 +0000 UTC" firstStartedPulling="2026-02-02 13:21:56.890328934 +0000 UTC m=+1257.192843323" lastFinishedPulling="2026-02-02 13:22:07.955625827 +0000 UTC m=+1268.258140216" observedRunningTime="2026-02-02 13:22:09.266685853 +0000 UTC m=+1269.569200262" watchObservedRunningTime="2026-02-02 13:22:09.318834672 +0000 UTC m=+1269.621349061" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.363349 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-8dm9r\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.363580 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-8dm9r\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.363606 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6mw6\" (UniqueName: \"kubernetes.io/projected/ab22a3c1-704b-4bc7-81cb-623e429e619a-kube-api-access-c6mw6\") pod \"dnsmasq-dns-7fd796d7df-8dm9r\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.363642 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-config\") pod \"dnsmasq-dns-7fd796d7df-8dm9r\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.364914 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-8dm9r\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.378263 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-config\") pod \"dnsmasq-dns-7fd796d7df-8dm9r\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.380505 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.182385403 podStartE2EDuration="31.380481559s" podCreationTimestamp="2026-02-02 13:21:38 +0000 
UTC" firstStartedPulling="2026-02-02 13:21:54.289070606 +0000 UTC m=+1254.591584995" lastFinishedPulling="2026-02-02 13:22:07.487166762 +0000 UTC m=+1267.789681151" observedRunningTime="2026-02-02 13:22:09.334960218 +0000 UTC m=+1269.637474617" watchObservedRunningTime="2026-02-02 13:22:09.380481559 +0000 UTC m=+1269.682995948" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.383285 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-8dm9r\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.427430 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6mw6\" (UniqueName: \"kubernetes.io/projected/ab22a3c1-704b-4bc7-81cb-623e429e619a-kube-api-access-c6mw6\") pod \"dnsmasq-dns-7fd796d7df-8dm9r\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.462539 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.473391 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-8dm9r"] Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.490360 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9n8f"] Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.492778 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.497023 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.512721 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9n8f"] Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.577285 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-config\") pod \"dnsmasq-dns-86db49b7ff-x9n8f\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.577343 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-x9n8f\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.577382 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-x9n8f\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.577425 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-x9n8f\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.577459 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj6v4\" (UniqueName: \"kubernetes.io/projected/02b496f0-c99d-43e9-9e8a-03286d8966ab-kube-api-access-dj6v4\") pod \"dnsmasq-dns-86db49b7ff-x9n8f\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.679006 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-config\") pod \"dnsmasq-dns-86db49b7ff-x9n8f\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.679413 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-x9n8f\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.679473 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-x9n8f\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.679538 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-x9n8f\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.679600 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj6v4\" (UniqueName: \"kubernetes.io/projected/02b496f0-c99d-43e9-9e8a-03286d8966ab-kube-api-access-dj6v4\") pod \"dnsmasq-dns-86db49b7ff-x9n8f\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.680379 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-config\") pod \"dnsmasq-dns-86db49b7ff-x9n8f\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.680799 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-x9n8f\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.681221 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-ovsdbserver-nb\") 
pod \"dnsmasq-dns-86db49b7ff-x9n8f\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.682845 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-x9n8f\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.720770 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj6v4\" (UniqueName: \"kubernetes.io/projected/02b496f0-c99d-43e9-9e8a-03286d8966ab-kube-api-access-dj6v4\") pod \"dnsmasq-dns-86db49b7ff-x9n8f\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.852675 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hkwkv"] Feb 02 13:22:09 crc kubenswrapper[4721]: I0202 13:22:09.915831 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:10 crc kubenswrapper[4721]: I0202 13:22:10.121805 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-8dm9r"] Feb 02 13:22:10 crc kubenswrapper[4721]: I0202 13:22:10.247710 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hkwkv" event={"ID":"753a63ae-970e-4dd1-a284-bc3b6027ca64","Type":"ContainerStarted","Data":"bfdc667102bd1d472804c474addc8c1f40e6e365d9c067acd507ccd135592a72"} Feb 02 13:22:10 crc kubenswrapper[4721]: I0202 13:22:10.250861 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" event={"ID":"ab22a3c1-704b-4bc7-81cb-623e429e619a","Type":"ContainerStarted","Data":"78c99d5eb08266bee26c3f183a5913dabc7aa4cac16d4baa27beddfa99913262"} Feb 02 13:22:10 crc kubenswrapper[4721]: I0202 13:22:10.257186 4721 generic.go:334] "Generic (PLEG): container finished" podID="a75df612-e3f4-4ea3-bfc8-daceaf59205d" containerID="14cb944d9ebd6f321f4e7337871523268072f36cf383a9853996146747d08a5b" exitCode=0 Feb 02 13:22:10 crc kubenswrapper[4721]: I0202 13:22:10.257480 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gz9nz" event={"ID":"a75df612-e3f4-4ea3-bfc8-daceaf59205d","Type":"ContainerDied","Data":"14cb944d9ebd6f321f4e7337871523268072f36cf383a9853996146747d08a5b"} Feb 02 13:22:10 crc kubenswrapper[4721]: I0202 13:22:10.548624 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9n8f"] Feb 02 13:22:10 crc kubenswrapper[4721]: W0202 13:22:10.570991 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02b496f0_c99d_43e9_9e8a_03286d8966ab.slice/crio-d756b18dd24d6e38b37f9d5682fc7973ac79057046e2526e8afd7ae562cccd1e WatchSource:0}: Error finding container d756b18dd24d6e38b37f9d5682fc7973ac79057046e2526e8afd7ae562cccd1e: Status 404 returned error can't find the container with id d756b18dd24d6e38b37f9d5682fc7973ac79057046e2526e8afd7ae562cccd1e Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.270164 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" 
event={"ID":"02b496f0-c99d-43e9-9e8a-03286d8966ab","Type":"ContainerStarted","Data":"d756b18dd24d6e38b37f9d5682fc7973ac79057046e2526e8afd7ae562cccd1e"} Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.272919 4721 generic.go:334] "Generic (PLEG): container finished" podID="ab22a3c1-704b-4bc7-81cb-623e429e619a" containerID="7462ae5c027f73fc404a31a58cb2b894d96a3e39c3ff01b6eaafb7a0a288d5dc" exitCode=0 Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.273015 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" event={"ID":"ab22a3c1-704b-4bc7-81cb-623e429e619a","Type":"ContainerDied","Data":"7462ae5c027f73fc404a31a58cb2b894d96a3e39c3ff01b6eaafb7a0a288d5dc"} Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.277306 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gz9nz" event={"ID":"a75df612-e3f4-4ea3-bfc8-daceaf59205d","Type":"ContainerStarted","Data":"c668936653f99869b2a3b3bceecad194fa33cfecf37308e54182c33305d0440c"} Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.689891 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.859646 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-config\") pod \"ab22a3c1-704b-4bc7-81cb-623e429e619a\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.859794 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-dns-svc\") pod \"ab22a3c1-704b-4bc7-81cb-623e429e619a\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.859876 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6mw6\" (UniqueName: \"kubernetes.io/projected/ab22a3c1-704b-4bc7-81cb-623e429e619a-kube-api-access-c6mw6\") pod \"ab22a3c1-704b-4bc7-81cb-623e429e619a\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.859907 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-ovsdbserver-nb\") pod \"ab22a3c1-704b-4bc7-81cb-623e429e619a\" (UID: \"ab22a3c1-704b-4bc7-81cb-623e429e619a\") " Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.863934 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab22a3c1-704b-4bc7-81cb-623e429e619a-kube-api-access-c6mw6" (OuterVolumeSpecName: "kube-api-access-c6mw6") pod "ab22a3c1-704b-4bc7-81cb-623e429e619a" (UID: "ab22a3c1-704b-4bc7-81cb-623e429e619a"). InnerVolumeSpecName "kube-api-access-c6mw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.881244 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ab22a3c1-704b-4bc7-81cb-623e429e619a" (UID: "ab22a3c1-704b-4bc7-81cb-623e429e619a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.884997 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ab22a3c1-704b-4bc7-81cb-623e429e619a" (UID: "ab22a3c1-704b-4bc7-81cb-623e429e619a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.886057 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-config" (OuterVolumeSpecName: "config") pod "ab22a3c1-704b-4bc7-81cb-623e429e619a" (UID: "ab22a3c1-704b-4bc7-81cb-623e429e619a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.962801 4721 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.962857 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6mw6\" (UniqueName: \"kubernetes.io/projected/ab22a3c1-704b-4bc7-81cb-623e429e619a-kube-api-access-c6mw6\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.962874 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:11 crc kubenswrapper[4721]: I0202 13:22:11.962887 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab22a3c1-704b-4bc7-81cb-623e429e619a-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.288136 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4e175d27-fe10-4fb7-9ce6-cb98379357cc","Type":"ContainerStarted","Data":"9bc96cfe0b5d7259d1047b30aa115867fe936695dc7f12460c53b3e4556e2705"} Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.290561 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.290540 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-8dm9r" event={"ID":"ab22a3c1-704b-4bc7-81cb-623e429e619a","Type":"ContainerDied","Data":"78c99d5eb08266bee26c3f183a5913dabc7aa4cac16d4baa27beddfa99913262"} Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.290699 4721 scope.go:117] "RemoveContainer" containerID="7462ae5c027f73fc404a31a58cb2b894d96a3e39c3ff01b6eaafb7a0a288d5dc" Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.293604 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gz9nz" event={"ID":"a75df612-e3f4-4ea3-bfc8-daceaf59205d","Type":"ContainerStarted","Data":"a880a4453b088ae93b60e019547efc4b43fba24594a2f483ba14d172b0f87fb8"} Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.293742 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.295829 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"080bfc29-50bc-4ba1-b097-4f5c54586d8c","Type":"ContainerStarted","Data":"788238e13b3a7d4c500208e838cbd379811c90ee8f0d3f7659036aaa10ee1e55"} Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.297266 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hkwkv" event={"ID":"753a63ae-970e-4dd1-a284-bc3b6027ca64","Type":"ContainerStarted","Data":"35547630b2e0f8b8d2c1eeefe1581ef2ec7f24a6564a1f148ec46816fc888253"} Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.299320 4721 generic.go:334] "Generic (PLEG): container finished" podID="02b496f0-c99d-43e9-9e8a-03286d8966ab" containerID="d0477d4beedf8835ceccc8981c1de2a9fe8aa3519682eb80e7972c0762297343" exitCode=0 Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.299378 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" event={"ID":"02b496f0-c99d-43e9-9e8a-03286d8966ab","Type":"ContainerDied","Data":"d0477d4beedf8835ceccc8981c1de2a9fe8aa3519682eb80e7972c0762297343"} Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.301857 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"754114a2-a012-43fe-923b-a8cc3df91aa0","Type":"ContainerStarted","Data":"f51717e20215c350c32f4c0b374e223fc4a90aedd712f943c01301452ac10dc6"} Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.316823 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.750053739 podStartE2EDuration="25.316801206s" podCreationTimestamp="2026-02-02 13:21:47 +0000 UTC" firstStartedPulling="2026-02-02 13:21:56.890835498 +0000 UTC m=+1257.193349887" lastFinishedPulling="2026-02-02 13:22:11.457582965 +0000 UTC m=+1271.760097354" observedRunningTime="2026-02-02 13:22:12.311929944 +0000 UTC m=+1272.614444333" watchObservedRunningTime="2026-02-02 13:22:12.316801206 +0000 UTC m=+1272.619315595" Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.343989 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-hkwkv" podStartSLOduration=2.74350902 podStartE2EDuration="4.34396725s" podCreationTimestamp="2026-02-02 13:22:08 +0000 UTC" firstStartedPulling="2026-02-02 13:22:09.885233715 +0000 UTC m=+1270.187748104" 
lastFinishedPulling="2026-02-02 13:22:11.485691935 +0000 UTC m=+1271.788206334" observedRunningTime="2026-02-02 13:22:12.331566145 +0000 UTC m=+1272.634080544" watchObservedRunningTime="2026-02-02 13:22:12.34396725 +0000 UTC m=+1272.646481649" Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.357673 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-gz9nz" podStartSLOduration=16.608507734 podStartE2EDuration="27.3576538s" podCreationTimestamp="2026-02-02 13:21:45 +0000 UTC" firstStartedPulling="2026-02-02 13:21:56.894147197 +0000 UTC m=+1257.196661586" lastFinishedPulling="2026-02-02 13:22:07.643293263 +0000 UTC m=+1267.945807652" observedRunningTime="2026-02-02 13:22:12.355757539 +0000 UTC m=+1272.658271938" watchObservedRunningTime="2026-02-02 13:22:12.3576538 +0000 UTC m=+1272.660168189" Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.475549 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=15.908973605 podStartE2EDuration="30.475518526s" podCreationTimestamp="2026-02-02 13:21:42 +0000 UTC" firstStartedPulling="2026-02-02 13:21:56.891036074 +0000 UTC m=+1257.193550463" lastFinishedPulling="2026-02-02 13:22:11.457580995 +0000 UTC m=+1271.760095384" observedRunningTime="2026-02-02 13:22:12.421846325 +0000 UTC m=+1272.724360744" watchObservedRunningTime="2026-02-02 13:22:12.475518526 +0000 UTC m=+1272.778032915" Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.518158 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-8dm9r"] Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.529199 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-8dm9r"] Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.830943 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 02 13:22:12 crc kubenswrapper[4721]: I0202 13:22:12.878400 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 02 13:22:13 crc kubenswrapper[4721]: I0202 13:22:13.314409 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" event={"ID":"02b496f0-c99d-43e9-9e8a-03286d8966ab","Type":"ContainerStarted","Data":"6404bb6f1951c91af9f8453984cfaf480f6307bbda26d7f2b85d0b4cee4e2109"} Feb 02 13:22:13 crc kubenswrapper[4721]: I0202 13:22:13.316103 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gz9nz" Feb 02 13:22:13 crc kubenswrapper[4721]: I0202 13:22:13.316149 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 02 13:22:13 crc kubenswrapper[4721]: I0202 13:22:13.343469 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" podStartSLOduration=4.343446352 podStartE2EDuration="4.343446352s" podCreationTimestamp="2026-02-02 13:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:22:13.337486281 +0000 UTC m=+1273.640000680" watchObservedRunningTime="2026-02-02 13:22:13.343446352 +0000 UTC m=+1273.645960751" Feb 02 13:22:13 crc kubenswrapper[4721]: I0202 13:22:13.369083 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 02 
13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.101775 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.102207 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.141634 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.257789 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.324111 4721 generic.go:334] "Generic (PLEG): container finished" podID="2505bd6b-64d4-4d17-9c1a-0e89562612be" containerID="d12641d1726ea2160227d17b476c80e4ad8727c8ef150b1b1b2de53e6dfe063d" exitCode=0 Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.324168 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2505bd6b-64d4-4d17-9c1a-0e89562612be","Type":"ContainerDied","Data":"d12641d1726ea2160227d17b476c80e4ad8727c8ef150b1b1b2de53e6dfe063d"} Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.326034 4721 generic.go:334] "Generic (PLEG): container finished" podID="f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4" containerID="3f181baca9c046b0491dc6e8352b3a4da843071fc3c26260b6a24a6eceb14a93" exitCode=0 Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.326114 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4","Type":"ContainerDied","Data":"3f181baca9c046b0491dc6e8352b3a4da843071fc3c26260b6a24a6eceb14a93"} Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.326631 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.390214 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.454537 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab22a3c1-704b-4bc7-81cb-623e429e619a" path="/var/lib/kubelet/pods/ab22a3c1-704b-4bc7-81cb-623e429e619a/volumes" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.631763 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 02 13:22:14 crc kubenswrapper[4721]: E0202 13:22:14.632358 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab22a3c1-704b-4bc7-81cb-623e429e619a" containerName="init" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.632380 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab22a3c1-704b-4bc7-81cb-623e429e619a" containerName="init" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.632648 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab22a3c1-704b-4bc7-81cb-623e429e619a" containerName="init" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.634204 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.644498 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.645146 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.645292 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.645449 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-96xv8" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.682445 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.748802 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd5938c1-e4b9-4437-a379-c25bc5b1c243-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.749237 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd5938c1-e4b9-4437-a379-c25bc5b1c243-scripts\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.749270 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkkqr\" (UniqueName: \"kubernetes.io/projected/cd5938c1-e4b9-4437-a379-c25bc5b1c243-kube-api-access-fkkqr\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.749483 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd5938c1-e4b9-4437-a379-c25bc5b1c243-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.749640 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd5938c1-e4b9-4437-a379-c25bc5b1c243-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.749913 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd5938c1-e4b9-4437-a379-c25bc5b1c243-config\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.749958 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5938c1-e4b9-4437-a379-c25bc5b1c243-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: 
I0202 13:22:14.763649 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.763721 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.763772 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.764649 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"56e02e958f304734b98b90c0b35547a7aaeb3ba27ad6cd35ef754f549abd2513"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.764721 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://56e02e958f304734b98b90c0b35547a7aaeb3ba27ad6cd35ef754f549abd2513" gracePeriod=600 Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.852431 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd5938c1-e4b9-4437-a379-c25bc5b1c243-config\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.852478 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5938c1-e4b9-4437-a379-c25bc5b1c243-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.852528 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd5938c1-e4b9-4437-a379-c25bc5b1c243-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.852565 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd5938c1-e4b9-4437-a379-c25bc5b1c243-scripts\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.852586 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkkqr\" (UniqueName: \"kubernetes.io/projected/cd5938c1-e4b9-4437-a379-c25bc5b1c243-kube-api-access-fkkqr\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc 
kubenswrapper[4721]: I0202 13:22:14.852655 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd5938c1-e4b9-4437-a379-c25bc5b1c243-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.852705 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd5938c1-e4b9-4437-a379-c25bc5b1c243-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.853260 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd5938c1-e4b9-4437-a379-c25bc5b1c243-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.853689 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd5938c1-e4b9-4437-a379-c25bc5b1c243-config\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.853771 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd5938c1-e4b9-4437-a379-c25bc5b1c243-scripts\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.857147 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd5938c1-e4b9-4437-a379-c25bc5b1c243-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.857227 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd5938c1-e4b9-4437-a379-c25bc5b1c243-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.863805 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd5938c1-e4b9-4437-a379-c25bc5b1c243-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.881901 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkkqr\" (UniqueName: \"kubernetes.io/projected/cd5938c1-e4b9-4437-a379-c25bc5b1c243-kube-api-access-fkkqr\") pod \"ovn-northd-0\" (UID: \"cd5938c1-e4b9-4437-a379-c25bc5b1c243\") " pod="openstack/ovn-northd-0" Feb 02 13:22:14 crc kubenswrapper[4721]: I0202 13:22:14.978741 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 02 13:22:15 crc kubenswrapper[4721]: I0202 13:22:15.342817 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="56e02e958f304734b98b90c0b35547a7aaeb3ba27ad6cd35ef754f549abd2513" exitCode=0 Feb 02 13:22:15 crc kubenswrapper[4721]: I0202 13:22:15.342864 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"56e02e958f304734b98b90c0b35547a7aaeb3ba27ad6cd35ef754f549abd2513"} Feb 02 13:22:15 crc kubenswrapper[4721]: I0202 13:22:15.342904 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"4c89d3977af7fbb3779c0661dadea0111ac2d8f3c3974c534b682ad6a4af4aac"} Feb 02 13:22:15 crc kubenswrapper[4721]: I0202 13:22:15.342939 4721 scope.go:117] "RemoveContainer" containerID="3e250ca33160c82bc83b5c1d01cc482ebd55cdb3c1b9ae291d6af786cb617e66" Feb 02 13:22:15 crc kubenswrapper[4721]: I0202 13:22:15.346062 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"2505bd6b-64d4-4d17-9c1a-0e89562612be","Type":"ContainerStarted","Data":"830088dbd5ee4718481c9e7cbef3e103bc42082b48115d388f8f6e997fb15bf0"} Feb 02 13:22:15 crc kubenswrapper[4721]: I0202 13:22:15.360367 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4","Type":"ContainerStarted","Data":"25c0fe8095f3ecd0c419a580ee1f9340592817f3c826d9555d60ff4e92044995"} Feb 02 13:22:15 crc kubenswrapper[4721]: I0202 13:22:15.404795 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=28.416191224 podStartE2EDuration="39.404776862s" podCreationTimestamp="2026-02-02 13:21:36 +0000 UTC" firstStartedPulling="2026-02-02 13:21:56.891148097 +0000 UTC m=+1257.193662496" lastFinishedPulling="2026-02-02 13:22:07.879733755 +0000 UTC m=+1268.182248134" observedRunningTime="2026-02-02 13:22:15.399922131 +0000 UTC m=+1275.702436520" watchObservedRunningTime="2026-02-02 13:22:15.404776862 +0000 UTC m=+1275.707291251" Feb 02 13:22:15 crc kubenswrapper[4721]: I0202 13:22:15.431119 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.948187148 podStartE2EDuration="38.431098954s" podCreationTimestamp="2026-02-02 13:21:37 +0000 UTC" firstStartedPulling="2026-02-02 13:21:54.255263182 +0000 UTC m=+1254.557777571" lastFinishedPulling="2026-02-02 13:22:07.738174988 +0000 UTC m=+1268.040689377" observedRunningTime="2026-02-02 13:22:15.41947489 +0000 UTC m=+1275.721989279" watchObservedRunningTime="2026-02-02 13:22:15.431098954 +0000 UTC m=+1275.733613353" Feb 02 13:22:15 crc kubenswrapper[4721]: I0202 13:22:15.521910 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 02 13:22:16 crc kubenswrapper[4721]: I0202 13:22:16.386590 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cd5938c1-e4b9-4437-a379-c25bc5b1c243","Type":"ContainerStarted","Data":"90892aafab2664cc738132e88bd477c123af707400f227ca6b329a637aeecfae"} Feb 02 13:22:17 crc kubenswrapper[4721]: I0202 13:22:17.398005 4721 generic.go:334] 
"Generic (PLEG): container finished" podID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerID="f51717e20215c350c32f4c0b374e223fc4a90aedd712f943c01301452ac10dc6" exitCode=0 Feb 02 13:22:17 crc kubenswrapper[4721]: I0202 13:22:17.398316 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"754114a2-a012-43fe-923b-a8cc3df91aa0","Type":"ContainerDied","Data":"f51717e20215c350c32f4c0b374e223fc4a90aedd712f943c01301452ac10dc6"} Feb 02 13:22:17 crc kubenswrapper[4721]: I0202 13:22:17.401974 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cd5938c1-e4b9-4437-a379-c25bc5b1c243","Type":"ContainerStarted","Data":"87693f073ac526e5ae970991a9a995206cb8ae50f02087847db9ea2ee617f6e9"} Feb 02 13:22:17 crc kubenswrapper[4721]: I0202 13:22:17.402014 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cd5938c1-e4b9-4437-a379-c25bc5b1c243","Type":"ContainerStarted","Data":"6c33e1641498bc27664c341a6ed9c261e33636435d7ce39409d4a5e987fc1e46"} Feb 02 13:22:17 crc kubenswrapper[4721]: I0202 13:22:17.402124 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 02 13:22:17 crc kubenswrapper[4721]: I0202 13:22:17.470811 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.396409971 podStartE2EDuration="3.470790029s" podCreationTimestamp="2026-02-02 13:22:14 +0000 UTC" firstStartedPulling="2026-02-02 13:22:15.512609707 +0000 UTC m=+1275.815124096" lastFinishedPulling="2026-02-02 13:22:16.586989765 +0000 UTC m=+1276.889504154" observedRunningTime="2026-02-02 13:22:17.470240955 +0000 UTC m=+1277.772755354" watchObservedRunningTime="2026-02-02 13:22:17.470790029 +0000 UTC m=+1277.773304418" Feb 02 13:22:17 crc kubenswrapper[4721]: I0202 13:22:17.604558 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 02 13:22:17 crc kubenswrapper[4721]: I0202 13:22:17.604624 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 02 13:22:19 crc kubenswrapper[4721]: I0202 13:22:19.363051 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 02 13:22:19 crc kubenswrapper[4721]: I0202 13:22:19.363498 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 02 13:22:19 crc kubenswrapper[4721]: I0202 13:22:19.917285 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" Feb 02 13:22:19 crc kubenswrapper[4721]: I0202 13:22:19.920949 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 02 13:22:19 crc kubenswrapper[4721]: I0202 13:22:19.988854 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ljjc5"] Feb 02 13:22:19 crc kubenswrapper[4721]: I0202 13:22:19.989155 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" podUID="f9598cea-e831-47fd-aa1a-08060e23bba2" containerName="dnsmasq-dns" containerID="cri-o://79ee8e8028b6aa26c38e11ef05e770e5dd8bbf55490a89dcd0603734b8ebe97e" gracePeriod=10 Feb 02 13:22:20 crc kubenswrapper[4721]: I0202 13:22:20.054391 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/openstack-galera-0" Feb 02 13:22:20 crc kubenswrapper[4721]: I0202 13:22:20.148367 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" podUID="f9598cea-e831-47fd-aa1a-08060e23bba2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Feb 02 13:22:20 crc kubenswrapper[4721]: I0202 13:22:20.456124 4721 generic.go:334] "Generic (PLEG): container finished" podID="f9598cea-e831-47fd-aa1a-08060e23bba2" containerID="79ee8e8028b6aa26c38e11ef05e770e5dd8bbf55490a89dcd0603734b8ebe97e" exitCode=0 Feb 02 13:22:20 crc kubenswrapper[4721]: I0202 13:22:20.456210 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" event={"ID":"f9598cea-e831-47fd-aa1a-08060e23bba2","Type":"ContainerDied","Data":"79ee8e8028b6aa26c38e11ef05e770e5dd8bbf55490a89dcd0603734b8ebe97e"} Feb 02 13:22:20 crc kubenswrapper[4721]: I0202 13:22:20.589095 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" Feb 02 13:22:20 crc kubenswrapper[4721]: I0202 13:22:20.681013 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9598cea-e831-47fd-aa1a-08060e23bba2-config\") pod \"f9598cea-e831-47fd-aa1a-08060e23bba2\" (UID: \"f9598cea-e831-47fd-aa1a-08060e23bba2\") " Feb 02 13:22:20 crc kubenswrapper[4721]: I0202 13:22:20.681247 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9598cea-e831-47fd-aa1a-08060e23bba2-dns-svc\") pod \"f9598cea-e831-47fd-aa1a-08060e23bba2\" (UID: \"f9598cea-e831-47fd-aa1a-08060e23bba2\") " Feb 02 13:22:20 crc kubenswrapper[4721]: I0202 13:22:20.681283 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpqwx\" (UniqueName: \"kubernetes.io/projected/f9598cea-e831-47fd-aa1a-08060e23bba2-kube-api-access-rpqwx\") pod \"f9598cea-e831-47fd-aa1a-08060e23bba2\" (UID: \"f9598cea-e831-47fd-aa1a-08060e23bba2\") " Feb 02 13:22:20 crc kubenswrapper[4721]: I0202 13:22:20.688307 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9598cea-e831-47fd-aa1a-08060e23bba2-kube-api-access-rpqwx" (OuterVolumeSpecName: "kube-api-access-rpqwx") pod "f9598cea-e831-47fd-aa1a-08060e23bba2" (UID: "f9598cea-e831-47fd-aa1a-08060e23bba2"). InnerVolumeSpecName "kube-api-access-rpqwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:20 crc kubenswrapper[4721]: I0202 13:22:20.732036 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9598cea-e831-47fd-aa1a-08060e23bba2-config" (OuterVolumeSpecName: "config") pod "f9598cea-e831-47fd-aa1a-08060e23bba2" (UID: "f9598cea-e831-47fd-aa1a-08060e23bba2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:20 crc kubenswrapper[4721]: I0202 13:22:20.738556 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9598cea-e831-47fd-aa1a-08060e23bba2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9598cea-e831-47fd-aa1a-08060e23bba2" (UID: "f9598cea-e831-47fd-aa1a-08060e23bba2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:20 crc kubenswrapper[4721]: I0202 13:22:20.784028 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9598cea-e831-47fd-aa1a-08060e23bba2-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:20 crc kubenswrapper[4721]: I0202 13:22:20.784086 4721 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9598cea-e831-47fd-aa1a-08060e23bba2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:20 crc kubenswrapper[4721]: I0202 13:22:20.784100 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpqwx\" (UniqueName: \"kubernetes.io/projected/f9598cea-e831-47fd-aa1a-08060e23bba2-kube-api-access-rpqwx\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.383642 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-4d7hn"] Feb 02 13:22:21 crc kubenswrapper[4721]: E0202 13:22:21.384161 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9598cea-e831-47fd-aa1a-08060e23bba2" containerName="dnsmasq-dns" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.384184 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9598cea-e831-47fd-aa1a-08060e23bba2" containerName="dnsmasq-dns" Feb 02 13:22:21 crc kubenswrapper[4721]: E0202 13:22:21.384228 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9598cea-e831-47fd-aa1a-08060e23bba2" containerName="init" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.384235 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9598cea-e831-47fd-aa1a-08060e23bba2" containerName="init" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.384483 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9598cea-e831-47fd-aa1a-08060e23bba2" containerName="dnsmasq-dns" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.385415 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-4d7hn" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.418554 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-4d7hn"] Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.465445 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-04a7-account-create-update-xhlq8"] Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.467007 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-04a7-account-create-update-xhlq8" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.470506 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.482747 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" event={"ID":"f9598cea-e831-47fd-aa1a-08060e23bba2","Type":"ContainerDied","Data":"dcd5d79dba084b60aa8e85cba431f52f20415983b8d07817faa29db86eef35ea"} Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.482807 4721 scope.go:117] "RemoveContainer" containerID="79ee8e8028b6aa26c38e11ef05e770e5dd8bbf55490a89dcd0603734b8ebe97e" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.482987 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ljjc5" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.482999 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-04a7-account-create-update-xhlq8"] Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.496879 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s645b\" (UniqueName: \"kubernetes.io/projected/b5e72b39-6085-4753-8b7d-a93a80c95d49-kube-api-access-s645b\") pod \"mysqld-exporter-openstack-db-create-4d7hn\" (UID: \"b5e72b39-6085-4753-8b7d-a93a80c95d49\") " pod="openstack/mysqld-exporter-openstack-db-create-4d7hn" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.497218 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5e72b39-6085-4753-8b7d-a93a80c95d49-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-4d7hn\" (UID: \"b5e72b39-6085-4753-8b7d-a93a80c95d49\") " pod="openstack/mysqld-exporter-openstack-db-create-4d7hn" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.515516 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.550433 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-x4f5m"] Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.553677 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.576929 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-x4f5m"] Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.603476 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b74e699-bc4f-4415-a9dc-8ad52d916bc0-operator-scripts\") pod \"mysqld-exporter-04a7-account-create-update-xhlq8\" (UID: \"8b74e699-bc4f-4415-a9dc-8ad52d916bc0\") " pod="openstack/mysqld-exporter-04a7-account-create-update-xhlq8" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.603556 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wknbf\" (UniqueName: \"kubernetes.io/projected/8b74e699-bc4f-4415-a9dc-8ad52d916bc0-kube-api-access-wknbf\") pod \"mysqld-exporter-04a7-account-create-update-xhlq8\" (UID: \"8b74e699-bc4f-4415-a9dc-8ad52d916bc0\") " pod="openstack/mysqld-exporter-04a7-account-create-update-xhlq8" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.603606 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5e72b39-6085-4753-8b7d-a93a80c95d49-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-4d7hn\" (UID: \"b5e72b39-6085-4753-8b7d-a93a80c95d49\") " pod="openstack/mysqld-exporter-openstack-db-create-4d7hn" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.603724 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s645b\" (UniqueName: \"kubernetes.io/projected/b5e72b39-6085-4753-8b7d-a93a80c95d49-kube-api-access-s645b\") pod \"mysqld-exporter-openstack-db-create-4d7hn\" (UID: \"b5e72b39-6085-4753-8b7d-a93a80c95d49\") " 
pod="openstack/mysqld-exporter-openstack-db-create-4d7hn" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.604977 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5e72b39-6085-4753-8b7d-a93a80c95d49-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-4d7hn\" (UID: \"b5e72b39-6085-4753-8b7d-a93a80c95d49\") " pod="openstack/mysqld-exporter-openstack-db-create-4d7hn" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.610057 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ljjc5"] Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.634651 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s645b\" (UniqueName: \"kubernetes.io/projected/b5e72b39-6085-4753-8b7d-a93a80c95d49-kube-api-access-s645b\") pod \"mysqld-exporter-openstack-db-create-4d7hn\" (UID: \"b5e72b39-6085-4753-8b7d-a93a80c95d49\") " pod="openstack/mysqld-exporter-openstack-db-create-4d7hn" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.641265 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ljjc5"] Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.705565 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2hmg\" (UniqueName: \"kubernetes.io/projected/a12cebe8-c719-4841-8d01-e9faf9b745cf-kube-api-access-h2hmg\") pod \"dnsmasq-dns-698758b865-x4f5m\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.705629 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-dns-svc\") pod \"dnsmasq-dns-698758b865-x4f5m\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.706054 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-x4f5m\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.706272 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-4d7hn" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.706295 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b74e699-bc4f-4415-a9dc-8ad52d916bc0-operator-scripts\") pod \"mysqld-exporter-04a7-account-create-update-xhlq8\" (UID: \"8b74e699-bc4f-4415-a9dc-8ad52d916bc0\") " pod="openstack/mysqld-exporter-04a7-account-create-update-xhlq8" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.706406 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wknbf\" (UniqueName: \"kubernetes.io/projected/8b74e699-bc4f-4415-a9dc-8ad52d916bc0-kube-api-access-wknbf\") pod \"mysqld-exporter-04a7-account-create-update-xhlq8\" (UID: \"8b74e699-bc4f-4415-a9dc-8ad52d916bc0\") " pod="openstack/mysqld-exporter-04a7-account-create-update-xhlq8" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.706488 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-x4f5m\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.706513 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-config\") pod \"dnsmasq-dns-698758b865-x4f5m\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.707361 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b74e699-bc4f-4415-a9dc-8ad52d916bc0-operator-scripts\") pod \"mysqld-exporter-04a7-account-create-update-xhlq8\" (UID: \"8b74e699-bc4f-4415-a9dc-8ad52d916bc0\") " pod="openstack/mysqld-exporter-04a7-account-create-update-xhlq8" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.718740 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.750354 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wknbf\" (UniqueName: \"kubernetes.io/projected/8b74e699-bc4f-4415-a9dc-8ad52d916bc0-kube-api-access-wknbf\") pod \"mysqld-exporter-04a7-account-create-update-xhlq8\" (UID: \"8b74e699-bc4f-4415-a9dc-8ad52d916bc0\") " pod="openstack/mysqld-exporter-04a7-account-create-update-xhlq8" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.807903 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-x4f5m\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.807961 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-config\") pod \"dnsmasq-dns-698758b865-x4f5m\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " 
pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.808006 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2hmg\" (UniqueName: \"kubernetes.io/projected/a12cebe8-c719-4841-8d01-e9faf9b745cf-kube-api-access-h2hmg\") pod \"dnsmasq-dns-698758b865-x4f5m\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.808037 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-dns-svc\") pod \"dnsmasq-dns-698758b865-x4f5m\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.808215 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-x4f5m\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.812996 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-config\") pod \"dnsmasq-dns-698758b865-x4f5m\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.813290 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-dns-svc\") pod \"dnsmasq-dns-698758b865-x4f5m\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.813305 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-x4f5m\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.813445 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-x4f5m\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.827739 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2hmg\" (UniqueName: \"kubernetes.io/projected/a12cebe8-c719-4841-8d01-e9faf9b745cf-kube-api-access-h2hmg\") pod \"dnsmasq-dns-698758b865-x4f5m\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.832674 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-04a7-account-create-update-xhlq8" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.854431 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 02 13:22:21 crc kubenswrapper[4721]: I0202 13:22:21.893449 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.425579 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9598cea-e831-47fd-aa1a-08060e23bba2" path="/var/lib/kubelet/pods/f9598cea-e831-47fd-aa1a-08060e23bba2/volumes" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.646830 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.683814 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.687686 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.687745 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.687826 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2jt7r" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.687976 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.704684 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.728429 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eabe6b07-da9d-4980-99b4-12c02640c88d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.728530 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eabe6b07-da9d-4980-99b4-12c02640c88d-lock\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.728551 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2445c7d5-729e-46d6-8fce-b5ddd9535e1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2445c7d5-729e-46d6-8fce-b5ddd9535e1d\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.728633 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.728650 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" 
(UniqueName: \"kubernetes.io/empty-dir/eabe6b07-da9d-4980-99b4-12c02640c88d-cache\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.728666 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9mc9\" (UniqueName: \"kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-kube-api-access-l9mc9\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.830531 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eabe6b07-da9d-4980-99b4-12c02640c88d-lock\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.830594 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2445c7d5-729e-46d6-8fce-b5ddd9535e1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2445c7d5-729e-46d6-8fce-b5ddd9535e1d\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.830728 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.830757 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/eabe6b07-da9d-4980-99b4-12c02640c88d-cache\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.830783 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9mc9\" (UniqueName: \"kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-kube-api-access-l9mc9\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.830884 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eabe6b07-da9d-4980-99b4-12c02640c88d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.831985 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eabe6b07-da9d-4980-99b4-12c02640c88d-lock\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.832083 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/eabe6b07-da9d-4980-99b4-12c02640c88d-cache\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: E0202 13:22:22.833234 4721 projected.go:288] Couldn't get 
configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 13:22:22 crc kubenswrapper[4721]: E0202 13:22:22.833348 4721 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 13:22:22 crc kubenswrapper[4721]: E0202 13:22:22.833482 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift podName:eabe6b07-da9d-4980-99b4-12c02640c88d nodeName:}" failed. No retries permitted until 2026-02-02 13:22:23.333463615 +0000 UTC m=+1283.635978004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift") pod "swift-storage-0" (UID: "eabe6b07-da9d-4980-99b4-12c02640c88d") : configmap "swift-ring-files" not found Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.834609 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.834641 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2445c7d5-729e-46d6-8fce-b5ddd9535e1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2445c7d5-729e-46d6-8fce-b5ddd9535e1d\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b744527ddf0b30b1cc5ab21b1766bbfac23b2e50aed8c717fd9a8009cfeccd09/globalmount\"" pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.837546 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eabe6b07-da9d-4980-99b4-12c02640c88d-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.860988 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9mc9\" (UniqueName: \"kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-kube-api-access-l9mc9\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:22 crc kubenswrapper[4721]: I0202 13:22:22.877342 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2445c7d5-729e-46d6-8fce-b5ddd9535e1d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2445c7d5-729e-46d6-8fce-b5ddd9535e1d\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.170714 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-4rnrx"] Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.172535 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.175520 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.176738 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.176769 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.198344 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-4rnrx"] Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.238833 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-scripts\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.239031 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-combined-ca-bundle\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.239099 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-ring-data-devices\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.239125 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-etc-swift\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.239276 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-swiftconf\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.239338 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-dispersionconf\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.239450 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5rm5\" (UniqueName: \"kubernetes.io/projected/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-kube-api-access-q5rm5\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 
13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.341245 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-etc-swift\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.341293 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-swiftconf\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.341324 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-dispersionconf\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.341384 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5rm5\" (UniqueName: \"kubernetes.io/projected/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-kube-api-access-q5rm5\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.341861 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-scripts\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.342003 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-combined-ca-bundle\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.342031 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.342049 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-ring-data-devices\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.342467 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-etc-swift\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: E0202 13:22:23.342636 4721 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap 
"swift-ring-files" not found Feb 02 13:22:23 crc kubenswrapper[4721]: E0202 13:22:23.342664 4721 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 13:22:23 crc kubenswrapper[4721]: E0202 13:22:23.342723 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift podName:eabe6b07-da9d-4980-99b4-12c02640c88d nodeName:}" failed. No retries permitted until 2026-02-02 13:22:24.342700393 +0000 UTC m=+1284.645214792 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift") pod "swift-storage-0" (UID: "eabe6b07-da9d-4980-99b4-12c02640c88d") : configmap "swift-ring-files" not found Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.343215 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-scripts\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.343500 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-ring-data-devices\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.347515 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-swiftconf\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.348562 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-combined-ca-bundle\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.358133 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5rm5\" (UniqueName: \"kubernetes.io/projected/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-kube-api-access-q5rm5\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.361341 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-dispersionconf\") pod \"swift-ring-rebalance-4rnrx\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") " pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:23 crc kubenswrapper[4721]: I0202 13:22:23.499932 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-4rnrx" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.360151 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:24 crc kubenswrapper[4721]: E0202 13:22:24.360301 4721 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 13:22:24 crc kubenswrapper[4721]: E0202 13:22:24.360504 4721 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 13:22:24 crc kubenswrapper[4721]: E0202 13:22:24.360555 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift podName:eabe6b07-da9d-4980-99b4-12c02640c88d nodeName:}" failed. No retries permitted until 2026-02-02 13:22:26.360541662 +0000 UTC m=+1286.663056051 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift") pod "swift-storage-0" (UID: "eabe6b07-da9d-4980-99b4-12c02640c88d") : configmap "swift-ring-files" not found Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.424289 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-sl4gx"] Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.425626 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-sl4gx" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.428058 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-sl4gx"] Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.461728 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d46c6f8-aff0-4b28-a71b-d98a894afdaf-operator-scripts\") pod \"glance-db-create-sl4gx\" (UID: \"9d46c6f8-aff0-4b28-a71b-d98a894afdaf\") " pod="openstack/glance-db-create-sl4gx" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.462653 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n99fz\" (UniqueName: \"kubernetes.io/projected/9d46c6f8-aff0-4b28-a71b-d98a894afdaf-kube-api-access-n99fz\") pod \"glance-db-create-sl4gx\" (UID: \"9d46c6f8-aff0-4b28-a71b-d98a894afdaf\") " pod="openstack/glance-db-create-sl4gx" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.519133 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1d7e-account-create-update-7jmk5"] Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.520533 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1d7e-account-create-update-7jmk5" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.522262 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.535037 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1d7e-account-create-update-7jmk5"] Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.569665 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q75bj\" (UniqueName: \"kubernetes.io/projected/4e1ef9e5-26ab-4b7b-b255-73968ed867ce-kube-api-access-q75bj\") pod \"glance-1d7e-account-create-update-7jmk5\" (UID: \"4e1ef9e5-26ab-4b7b-b255-73968ed867ce\") " pod="openstack/glance-1d7e-account-create-update-7jmk5" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.569854 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d46c6f8-aff0-4b28-a71b-d98a894afdaf-operator-scripts\") pod \"glance-db-create-sl4gx\" (UID: \"9d46c6f8-aff0-4b28-a71b-d98a894afdaf\") " pod="openstack/glance-db-create-sl4gx" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.570086 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n99fz\" (UniqueName: \"kubernetes.io/projected/9d46c6f8-aff0-4b28-a71b-d98a894afdaf-kube-api-access-n99fz\") pod \"glance-db-create-sl4gx\" (UID: \"9d46c6f8-aff0-4b28-a71b-d98a894afdaf\") " pod="openstack/glance-db-create-sl4gx" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.570212 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e1ef9e5-26ab-4b7b-b255-73968ed867ce-operator-scripts\") pod \"glance-1d7e-account-create-update-7jmk5\" (UID: \"4e1ef9e5-26ab-4b7b-b255-73968ed867ce\") " pod="openstack/glance-1d7e-account-create-update-7jmk5" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.571279 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d46c6f8-aff0-4b28-a71b-d98a894afdaf-operator-scripts\") pod \"glance-db-create-sl4gx\" (UID: \"9d46c6f8-aff0-4b28-a71b-d98a894afdaf\") " pod="openstack/glance-db-create-sl4gx" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.595054 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n99fz\" (UniqueName: \"kubernetes.io/projected/9d46c6f8-aff0-4b28-a71b-d98a894afdaf-kube-api-access-n99fz\") pod \"glance-db-create-sl4gx\" (UID: \"9d46c6f8-aff0-4b28-a71b-d98a894afdaf\") " pod="openstack/glance-db-create-sl4gx" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.672266 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e1ef9e5-26ab-4b7b-b255-73968ed867ce-operator-scripts\") pod \"glance-1d7e-account-create-update-7jmk5\" (UID: \"4e1ef9e5-26ab-4b7b-b255-73968ed867ce\") " pod="openstack/glance-1d7e-account-create-update-7jmk5" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.672383 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q75bj\" (UniqueName: \"kubernetes.io/projected/4e1ef9e5-26ab-4b7b-b255-73968ed867ce-kube-api-access-q75bj\") pod 
\"glance-1d7e-account-create-update-7jmk5\" (UID: \"4e1ef9e5-26ab-4b7b-b255-73968ed867ce\") " pod="openstack/glance-1d7e-account-create-update-7jmk5" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.673006 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e1ef9e5-26ab-4b7b-b255-73968ed867ce-operator-scripts\") pod \"glance-1d7e-account-create-update-7jmk5\" (UID: \"4e1ef9e5-26ab-4b7b-b255-73968ed867ce\") " pod="openstack/glance-1d7e-account-create-update-7jmk5" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.695535 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q75bj\" (UniqueName: \"kubernetes.io/projected/4e1ef9e5-26ab-4b7b-b255-73968ed867ce-kube-api-access-q75bj\") pod \"glance-1d7e-account-create-update-7jmk5\" (UID: \"4e1ef9e5-26ab-4b7b-b255-73968ed867ce\") " pod="openstack/glance-1d7e-account-create-update-7jmk5" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.751740 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-sl4gx" Feb 02 13:22:24 crc kubenswrapper[4721]: I0202 13:22:24.838737 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1d7e-account-create-update-7jmk5" Feb 02 13:22:25 crc kubenswrapper[4721]: I0202 13:22:25.604287 4721 scope.go:117] "RemoveContainer" containerID="a20ee72f3c2584aa99632a64dbf1cb3b2b0b3ad83db068faf7bb964a8d3d6314" Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.146203 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vfbpx"] Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.148903 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vfbpx" Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.152352 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.166199 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vfbpx"] Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.210419 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c47d165e-de4e-4f3a-8f66-4dab149c7b5e-operator-scripts\") pod \"root-account-create-update-vfbpx\" (UID: \"c47d165e-de4e-4f3a-8f66-4dab149c7b5e\") " pod="openstack/root-account-create-update-vfbpx" Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.210483 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6dcx\" (UniqueName: \"kubernetes.io/projected/c47d165e-de4e-4f3a-8f66-4dab149c7b5e-kube-api-access-p6dcx\") pod \"root-account-create-update-vfbpx\" (UID: \"c47d165e-de4e-4f3a-8f66-4dab149c7b5e\") " pod="openstack/root-account-create-update-vfbpx" Feb 02 13:22:26 crc kubenswrapper[4721]: W0202 13:22:26.217268 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda12cebe8_c719_4841_8d01_e9faf9b745cf.slice/crio-521bca06923fe78f1ba71782b798785e3b87c44a41b5a65d319554abd4047afe WatchSource:0}: Error finding container 521bca06923fe78f1ba71782b798785e3b87c44a41b5a65d319554abd4047afe: Status 404 returned error can't find the container with id 521bca06923fe78f1ba71782b798785e3b87c44a41b5a65d319554abd4047afe Feb 02 13:22:26 crc kubenswrapper[4721]: W0202 13:22:26.219755 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5e72b39_6085_4753_8b7d_a93a80c95d49.slice/crio-2a68a42d8895510ad7ca8c635add6a94961e7a340ade2d4390e22073b385345a WatchSource:0}: Error finding container 2a68a42d8895510ad7ca8c635add6a94961e7a340ade2d4390e22073b385345a: Status 404 returned error can't find the container with id 2a68a42d8895510ad7ca8c635add6a94961e7a340ade2d4390e22073b385345a Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.224021 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-4d7hn"] Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.235183 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-x4f5m"] Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.312930 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c47d165e-de4e-4f3a-8f66-4dab149c7b5e-operator-scripts\") pod \"root-account-create-update-vfbpx\" (UID: \"c47d165e-de4e-4f3a-8f66-4dab149c7b5e\") " pod="openstack/root-account-create-update-vfbpx" Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.313186 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6dcx\" (UniqueName: \"kubernetes.io/projected/c47d165e-de4e-4f3a-8f66-4dab149c7b5e-kube-api-access-p6dcx\") pod \"root-account-create-update-vfbpx\" (UID: \"c47d165e-de4e-4f3a-8f66-4dab149c7b5e\") " pod="openstack/root-account-create-update-vfbpx" Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 
13:22:26.313825 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c47d165e-de4e-4f3a-8f66-4dab149c7b5e-operator-scripts\") pod \"root-account-create-update-vfbpx\" (UID: \"c47d165e-de4e-4f3a-8f66-4dab149c7b5e\") " pod="openstack/root-account-create-update-vfbpx" Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.377669 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6dcx\" (UniqueName: \"kubernetes.io/projected/c47d165e-de4e-4f3a-8f66-4dab149c7b5e-kube-api-access-p6dcx\") pod \"root-account-create-update-vfbpx\" (UID: \"c47d165e-de4e-4f3a-8f66-4dab149c7b5e\") " pod="openstack/root-account-create-update-vfbpx" Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.420694 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:26 crc kubenswrapper[4721]: E0202 13:22:26.421358 4721 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 13:22:26 crc kubenswrapper[4721]: E0202 13:22:26.421377 4721 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 13:22:26 crc kubenswrapper[4721]: E0202 13:22:26.421420 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift podName:eabe6b07-da9d-4980-99b4-12c02640c88d nodeName:}" failed. No retries permitted until 2026-02-02 13:22:30.421403109 +0000 UTC m=+1290.723917498 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift") pod "swift-storage-0" (UID: "eabe6b07-da9d-4980-99b4-12c02640c88d") : configmap "swift-ring-files" not found Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.433326 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-sl4gx"] Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.481335 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vfbpx" Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.531665 4721 generic.go:334] "Generic (PLEG): container finished" podID="a12cebe8-c719-4841-8d01-e9faf9b745cf" containerID="b3aeeebb46496223c552a8ed7c33309ec906c0d30db8ba232bc642182832692e" exitCode=0 Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.531744 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-x4f5m" event={"ID":"a12cebe8-c719-4841-8d01-e9faf9b745cf","Type":"ContainerDied","Data":"b3aeeebb46496223c552a8ed7c33309ec906c0d30db8ba232bc642182832692e"} Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.531777 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-x4f5m" event={"ID":"a12cebe8-c719-4841-8d01-e9faf9b745cf","Type":"ContainerStarted","Data":"521bca06923fe78f1ba71782b798785e3b87c44a41b5a65d319554abd4047afe"} Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.537564 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-4d7hn" event={"ID":"b5e72b39-6085-4753-8b7d-a93a80c95d49","Type":"ContainerStarted","Data":"ac2082a5d3a7b825797912c8d9660423dc0e0e1d5b6ff60e8c46690201c145fc"} Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.537619 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-4d7hn" event={"ID":"b5e72b39-6085-4753-8b7d-a93a80c95d49","Type":"ContainerStarted","Data":"2a68a42d8895510ad7ca8c635add6a94961e7a340ade2d4390e22073b385345a"} Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.539936 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sl4gx" event={"ID":"9d46c6f8-aff0-4b28-a71b-d98a894afdaf","Type":"ContainerStarted","Data":"139d705bb752a02c5df86ebbf850d7ede58efa5094bbbfe34e470f6b2bf2ddc6"} Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.547481 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"754114a2-a012-43fe-923b-a8cc3df91aa0","Type":"ContainerStarted","Data":"ee494f7d068c5f80477a3c22c3a31526d01c4a02a690967dc4ac1b911a158a98"} Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.577084 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-db-create-4d7hn" podStartSLOduration=5.577049147 podStartE2EDuration="5.577049147s" podCreationTimestamp="2026-02-02 13:22:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:22:26.566287456 +0000 UTC m=+1286.868801845" watchObservedRunningTime="2026-02-02 13:22:26.577049147 +0000 UTC m=+1286.879563536" Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.631472 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-04a7-account-create-update-xhlq8"] Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.657500 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-4rnrx"] Feb 02 13:22:26 crc kubenswrapper[4721]: I0202 13:22:26.667428 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1d7e-account-create-update-7jmk5"] Feb 02 13:22:26 crc kubenswrapper[4721]: W0202 13:22:26.681629 4721 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e1ef9e5_26ab_4b7b_b255_73968ed867ce.slice/crio-9a07e59cfbbbd47e4cc366cbab15fadcfeb5402edac13e46b25d05b3feb38e9f WatchSource:0}: Error finding container 9a07e59cfbbbd47e4cc366cbab15fadcfeb5402edac13e46b25d05b3feb38e9f: Status 404 returned error can't find the container with id 9a07e59cfbbbd47e4cc366cbab15fadcfeb5402edac13e46b25d05b3feb38e9f Feb 02 13:22:26 crc kubenswrapper[4721]: W0202 13:22:26.687613 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd1f15d5_77dc_4b6d_81bf_c2a8286da820.slice/crio-75aba502874704cff954aba43521e9024690a740951c41d1966d4ffb33d85812 WatchSource:0}: Error finding container 75aba502874704cff954aba43521e9024690a740951c41d1966d4ffb33d85812: Status 404 returned error can't find the container with id 75aba502874704cff954aba43521e9024690a740951c41d1966d4ffb33d85812 Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.030384 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vfbpx"] Feb 02 13:22:27 crc kubenswrapper[4721]: W0202 13:22:27.031121 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc47d165e_de4e_4f3a_8f66_4dab149c7b5e.slice/crio-d783d0e7af03e0407d90429cf0a634a80162978655ab4c2268c94981bd9e9d13 WatchSource:0}: Error finding container d783d0e7af03e0407d90429cf0a634a80162978655ab4c2268c94981bd9e9d13: Status 404 returned error can't find the container with id d783d0e7af03e0407d90429cf0a634a80162978655ab4c2268c94981bd9e9d13 Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.559233 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4rnrx" event={"ID":"bd1f15d5-77dc-4b6d-81bf-c2a8286da820","Type":"ContainerStarted","Data":"75aba502874704cff954aba43521e9024690a740951c41d1966d4ffb33d85812"} Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.561657 4721 generic.go:334] "Generic (PLEG): container finished" podID="c47d165e-de4e-4f3a-8f66-4dab149c7b5e" containerID="c8ac03b5a6a963dc432f18a1012252ac15cbdaf5a852eb90b3130207aa267b95" exitCode=0 Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.561742 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vfbpx" event={"ID":"c47d165e-de4e-4f3a-8f66-4dab149c7b5e","Type":"ContainerDied","Data":"c8ac03b5a6a963dc432f18a1012252ac15cbdaf5a852eb90b3130207aa267b95"} Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.561772 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vfbpx" event={"ID":"c47d165e-de4e-4f3a-8f66-4dab149c7b5e","Type":"ContainerStarted","Data":"d783d0e7af03e0407d90429cf0a634a80162978655ab4c2268c94981bd9e9d13"} Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.565501 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-x4f5m" event={"ID":"a12cebe8-c719-4841-8d01-e9faf9b745cf","Type":"ContainerStarted","Data":"76fa42f68bb28a61e9cdddef88da78612bb32973f43a836c40f835e5ed6d0856"} Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.565579 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.568490 4721 generic.go:334] "Generic (PLEG): container finished" podID="8b74e699-bc4f-4415-a9dc-8ad52d916bc0" 
containerID="fa62cc31d8fc9109f8c7236f7067b1ae22093077c72a6872a6dc77d5cf6674c5" exitCode=0 Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.568547 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-04a7-account-create-update-xhlq8" event={"ID":"8b74e699-bc4f-4415-a9dc-8ad52d916bc0","Type":"ContainerDied","Data":"fa62cc31d8fc9109f8c7236f7067b1ae22093077c72a6872a6dc77d5cf6674c5"} Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.568613 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-04a7-account-create-update-xhlq8" event={"ID":"8b74e699-bc4f-4415-a9dc-8ad52d916bc0","Type":"ContainerStarted","Data":"ac7c59871bbb981b1f42db9994a76f1de10fc29eb7b41e289dce1c711d0abf41"} Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.570281 4721 generic.go:334] "Generic (PLEG): container finished" podID="b5e72b39-6085-4753-8b7d-a93a80c95d49" containerID="ac2082a5d3a7b825797912c8d9660423dc0e0e1d5b6ff60e8c46690201c145fc" exitCode=0 Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.570351 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-4d7hn" event={"ID":"b5e72b39-6085-4753-8b7d-a93a80c95d49","Type":"ContainerDied","Data":"ac2082a5d3a7b825797912c8d9660423dc0e0e1d5b6ff60e8c46690201c145fc"} Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.572653 4721 generic.go:334] "Generic (PLEG): container finished" podID="4e1ef9e5-26ab-4b7b-b255-73968ed867ce" containerID="c036ed84a2cba404110a4db04b8c7d0f021199196a70d367772128ca1a327056" exitCode=0 Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.572702 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1d7e-account-create-update-7jmk5" event={"ID":"4e1ef9e5-26ab-4b7b-b255-73968ed867ce","Type":"ContainerDied","Data":"c036ed84a2cba404110a4db04b8c7d0f021199196a70d367772128ca1a327056"} Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.572723 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1d7e-account-create-update-7jmk5" event={"ID":"4e1ef9e5-26ab-4b7b-b255-73968ed867ce","Type":"ContainerStarted","Data":"9a07e59cfbbbd47e4cc366cbab15fadcfeb5402edac13e46b25d05b3feb38e9f"} Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.574165 4721 generic.go:334] "Generic (PLEG): container finished" podID="9d46c6f8-aff0-4b28-a71b-d98a894afdaf" containerID="4fba6951bb982a13a5360303ec96e98896da4f493023d6f3bda466f64f4a3da5" exitCode=0 Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.574255 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sl4gx" event={"ID":"9d46c6f8-aff0-4b28-a71b-d98a894afdaf","Type":"ContainerDied","Data":"4fba6951bb982a13a5360303ec96e98896da4f493023d6f3bda466f64f4a3da5"} Feb 02 13:22:27 crc kubenswrapper[4721]: I0202 13:22:27.677541 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-x4f5m" podStartSLOduration=6.677515599 podStartE2EDuration="6.677515599s" podCreationTimestamp="2026-02-02 13:22:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:22:27.648492925 +0000 UTC m=+1287.951007324" watchObservedRunningTime="2026-02-02 13:22:27.677515599 +0000 UTC m=+1287.980029998" Feb 02 13:22:28 crc kubenswrapper[4721]: I0202 13:22:28.166722 4721 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-console/console-679d56c757-8hcnt" podUID="1e715356-9848-439f-a13d-eb00f34521ec" containerName="console" containerID="cri-o://2105e396598b1fd13640d6e576494052ae1ae901f42a2b1dd0e7f495d1eec506" gracePeriod=15 Feb 02 13:22:28 crc kubenswrapper[4721]: I0202 13:22:28.587336 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-679d56c757-8hcnt_1e715356-9848-439f-a13d-eb00f34521ec/console/0.log" Feb 02 13:22:28 crc kubenswrapper[4721]: I0202 13:22:28.587427 4721 generic.go:334] "Generic (PLEG): container finished" podID="1e715356-9848-439f-a13d-eb00f34521ec" containerID="2105e396598b1fd13640d6e576494052ae1ae901f42a2b1dd0e7f495d1eec506" exitCode=2 Feb 02 13:22:28 crc kubenswrapper[4721]: I0202 13:22:28.587598 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-679d56c757-8hcnt" event={"ID":"1e715356-9848-439f-a13d-eb00f34521ec","Type":"ContainerDied","Data":"2105e396598b1fd13640d6e576494052ae1ae901f42a2b1dd0e7f495d1eec506"} Feb 02 13:22:28 crc kubenswrapper[4721]: I0202 13:22:28.862900 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-4msnh"] Feb 02 13:22:28 crc kubenswrapper[4721]: I0202 13:22:28.864623 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4msnh" Feb 02 13:22:28 crc kubenswrapper[4721]: I0202 13:22:28.875758 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4msnh"] Feb 02 13:22:28 crc kubenswrapper[4721]: I0202 13:22:28.973380 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-e588-account-create-update-4crm9"] Feb 02 13:22:28 crc kubenswrapper[4721]: I0202 13:22:28.975052 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-e588-account-create-update-4crm9" Feb 02 13:22:28 crc kubenswrapper[4721]: I0202 13:22:28.979303 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 02 13:22:28 crc kubenswrapper[4721]: I0202 13:22:28.984954 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e588-account-create-update-4crm9"] Feb 02 13:22:28 crc kubenswrapper[4721]: I0202 13:22:28.995193 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzx4c\" (UniqueName: \"kubernetes.io/projected/d51234ae-bf99-49bc-a3bc-1b392f993726-kube-api-access-gzx4c\") pod \"keystone-db-create-4msnh\" (UID: \"d51234ae-bf99-49bc-a3bc-1b392f993726\") " pod="openstack/keystone-db-create-4msnh" Feb 02 13:22:28 crc kubenswrapper[4721]: I0202 13:22:28.995270 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d51234ae-bf99-49bc-a3bc-1b392f993726-operator-scripts\") pod \"keystone-db-create-4msnh\" (UID: \"d51234ae-bf99-49bc-a3bc-1b392f993726\") " pod="openstack/keystone-db-create-4msnh" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.096810 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d51234ae-bf99-49bc-a3bc-1b392f993726-operator-scripts\") pod \"keystone-db-create-4msnh\" (UID: \"d51234ae-bf99-49bc-a3bc-1b392f993726\") " pod="openstack/keystone-db-create-4msnh" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.096962 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpscl\" (UniqueName: \"kubernetes.io/projected/8261a2f3-c66a-441c-9fc6-a7a6a744b8a3-kube-api-access-kpscl\") pod \"keystone-e588-account-create-update-4crm9\" (UID: \"8261a2f3-c66a-441c-9fc6-a7a6a744b8a3\") " pod="openstack/keystone-e588-account-create-update-4crm9" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.097044 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzx4c\" (UniqueName: \"kubernetes.io/projected/d51234ae-bf99-49bc-a3bc-1b392f993726-kube-api-access-gzx4c\") pod \"keystone-db-create-4msnh\" (UID: \"d51234ae-bf99-49bc-a3bc-1b392f993726\") " pod="openstack/keystone-db-create-4msnh" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.097103 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8261a2f3-c66a-441c-9fc6-a7a6a744b8a3-operator-scripts\") pod \"keystone-e588-account-create-update-4crm9\" (UID: \"8261a2f3-c66a-441c-9fc6-a7a6a744b8a3\") " pod="openstack/keystone-e588-account-create-update-4crm9" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.097716 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d51234ae-bf99-49bc-a3bc-1b392f993726-operator-scripts\") pod \"keystone-db-create-4msnh\" (UID: \"d51234ae-bf99-49bc-a3bc-1b392f993726\") " pod="openstack/keystone-db-create-4msnh" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.129949 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzx4c\" (UniqueName: \"kubernetes.io/projected/d51234ae-bf99-49bc-a3bc-1b392f993726-kube-api-access-gzx4c\") pod 
\"keystone-db-create-4msnh\" (UID: \"d51234ae-bf99-49bc-a3bc-1b392f993726\") " pod="openstack/keystone-db-create-4msnh" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.201636 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8261a2f3-c66a-441c-9fc6-a7a6a744b8a3-operator-scripts\") pod \"keystone-e588-account-create-update-4crm9\" (UID: \"8261a2f3-c66a-441c-9fc6-a7a6a744b8a3\") " pod="openstack/keystone-e588-account-create-update-4crm9" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.201856 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpscl\" (UniqueName: \"kubernetes.io/projected/8261a2f3-c66a-441c-9fc6-a7a6a744b8a3-kube-api-access-kpscl\") pod \"keystone-e588-account-create-update-4crm9\" (UID: \"8261a2f3-c66a-441c-9fc6-a7a6a744b8a3\") " pod="openstack/keystone-e588-account-create-update-4crm9" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.202994 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4msnh" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.203099 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8261a2f3-c66a-441c-9fc6-a7a6a744b8a3-operator-scripts\") pod \"keystone-e588-account-create-update-4crm9\" (UID: \"8261a2f3-c66a-441c-9fc6-a7a6a744b8a3\") " pod="openstack/keystone-e588-account-create-update-4crm9" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.245870 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-b945b"] Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.247214 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpscl\" (UniqueName: \"kubernetes.io/projected/8261a2f3-c66a-441c-9fc6-a7a6a744b8a3-kube-api-access-kpscl\") pod \"keystone-e588-account-create-update-4crm9\" (UID: \"8261a2f3-c66a-441c-9fc6-a7a6a744b8a3\") " pod="openstack/keystone-e588-account-create-update-4crm9" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.250297 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b945b" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.291713 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-b945b"] Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.300866 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e588-account-create-update-4crm9" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.329601 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f5fd-account-create-update-8s8md"] Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.331166 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f5fd-account-create-update-8s8md" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.335392 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.339883 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f5fd-account-create-update-8s8md"] Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.406615 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e071a9e9-d1fa-41c2-a0b4-3ddc2470055b-operator-scripts\") pod \"placement-db-create-b945b\" (UID: \"e071a9e9-d1fa-41c2-a0b4-3ddc2470055b\") " pod="openstack/placement-db-create-b945b" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.407100 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmwlz\" (UniqueName: \"kubernetes.io/projected/e071a9e9-d1fa-41c2-a0b4-3ddc2470055b-kube-api-access-mmwlz\") pod \"placement-db-create-b945b\" (UID: \"e071a9e9-d1fa-41c2-a0b4-3ddc2470055b\") " pod="openstack/placement-db-create-b945b" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.509794 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lfn7\" (UniqueName: \"kubernetes.io/projected/3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd-kube-api-access-8lfn7\") pod \"placement-f5fd-account-create-update-8s8md\" (UID: \"3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd\") " pod="openstack/placement-f5fd-account-create-update-8s8md" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.510209 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd-operator-scripts\") pod \"placement-f5fd-account-create-update-8s8md\" (UID: \"3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd\") " pod="openstack/placement-f5fd-account-create-update-8s8md" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.510613 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmwlz\" (UniqueName: \"kubernetes.io/projected/e071a9e9-d1fa-41c2-a0b4-3ddc2470055b-kube-api-access-mmwlz\") pod \"placement-db-create-b945b\" (UID: \"e071a9e9-d1fa-41c2-a0b4-3ddc2470055b\") " pod="openstack/placement-db-create-b945b" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.510949 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e071a9e9-d1fa-41c2-a0b4-3ddc2470055b-operator-scripts\") pod \"placement-db-create-b945b\" (UID: \"e071a9e9-d1fa-41c2-a0b4-3ddc2470055b\") " pod="openstack/placement-db-create-b945b" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.513536 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e071a9e9-d1fa-41c2-a0b4-3ddc2470055b-operator-scripts\") pod \"placement-db-create-b945b\" (UID: \"e071a9e9-d1fa-41c2-a0b4-3ddc2470055b\") " pod="openstack/placement-db-create-b945b" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.526927 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmwlz\" (UniqueName: 
\"kubernetes.io/projected/e071a9e9-d1fa-41c2-a0b4-3ddc2470055b-kube-api-access-mmwlz\") pod \"placement-db-create-b945b\" (UID: \"e071a9e9-d1fa-41c2-a0b4-3ddc2470055b\") " pod="openstack/placement-db-create-b945b" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.601340 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"754114a2-a012-43fe-923b-a8cc3df91aa0","Type":"ContainerStarted","Data":"cac1e1e212591677852178e9111f4a55b1b7ae8d7ed3f2354152657d155f18e1"} Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.613663 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lfn7\" (UniqueName: \"kubernetes.io/projected/3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd-kube-api-access-8lfn7\") pod \"placement-f5fd-account-create-update-8s8md\" (UID: \"3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd\") " pod="openstack/placement-f5fd-account-create-update-8s8md" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.613753 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd-operator-scripts\") pod \"placement-f5fd-account-create-update-8s8md\" (UID: \"3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd\") " pod="openstack/placement-f5fd-account-create-update-8s8md" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.614945 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd-operator-scripts\") pod \"placement-f5fd-account-create-update-8s8md\" (UID: \"3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd\") " pod="openstack/placement-f5fd-account-create-update-8s8md" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.621699 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b945b" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.633161 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lfn7\" (UniqueName: \"kubernetes.io/projected/3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd-kube-api-access-8lfn7\") pod \"placement-f5fd-account-create-update-8s8md\" (UID: \"3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd\") " pod="openstack/placement-f5fd-account-create-update-8s8md" Feb 02 13:22:29 crc kubenswrapper[4721]: I0202 13:22:29.668453 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f5fd-account-create-update-8s8md" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.432912 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:30 crc kubenswrapper[4721]: E0202 13:22:30.433461 4721 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 13:22:30 crc kubenswrapper[4721]: E0202 13:22:30.433478 4721 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 13:22:30 crc kubenswrapper[4721]: E0202 13:22:30.433521 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift podName:eabe6b07-da9d-4980-99b4-12c02640c88d nodeName:}" failed. No retries permitted until 2026-02-02 13:22:38.43350631 +0000 UTC m=+1298.736020699 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift") pod "swift-storage-0" (UID: "eabe6b07-da9d-4980-99b4-12c02640c88d") : configmap "swift-ring-files" not found Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.529526 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-04a7-account-create-update-xhlq8" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.574099 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vfbpx" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.576135 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-4d7hn" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.594437 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1d7e-account-create-update-7jmk5" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.622404 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-sl4gx" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.626890 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sl4gx" event={"ID":"9d46c6f8-aff0-4b28-a71b-d98a894afdaf","Type":"ContainerDied","Data":"139d705bb752a02c5df86ebbf850d7ede58efa5094bbbfe34e470f6b2bf2ddc6"} Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.626914 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-sl4gx" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.626933 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="139d705bb752a02c5df86ebbf850d7ede58efa5094bbbfe34e470f6b2bf2ddc6" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.628827 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vfbpx" event={"ID":"c47d165e-de4e-4f3a-8f66-4dab149c7b5e","Type":"ContainerDied","Data":"d783d0e7af03e0407d90429cf0a634a80162978655ab4c2268c94981bd9e9d13"} Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.628857 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vfbpx" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.628867 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d783d0e7af03e0407d90429cf0a634a80162978655ab4c2268c94981bd9e9d13" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.633791 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-04a7-account-create-update-xhlq8" event={"ID":"8b74e699-bc4f-4415-a9dc-8ad52d916bc0","Type":"ContainerDied","Data":"ac7c59871bbb981b1f42db9994a76f1de10fc29eb7b41e289dce1c711d0abf41"} Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.633856 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac7c59871bbb981b1f42db9994a76f1de10fc29eb7b41e289dce1c711d0abf41" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.633920 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-04a7-account-create-update-xhlq8" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.639440 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wknbf\" (UniqueName: \"kubernetes.io/projected/8b74e699-bc4f-4415-a9dc-8ad52d916bc0-kube-api-access-wknbf\") pod \"8b74e699-bc4f-4415-a9dc-8ad52d916bc0\" (UID: \"8b74e699-bc4f-4415-a9dc-8ad52d916bc0\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.639516 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b74e699-bc4f-4415-a9dc-8ad52d916bc0-operator-scripts\") pod \"8b74e699-bc4f-4415-a9dc-8ad52d916bc0\" (UID: \"8b74e699-bc4f-4415-a9dc-8ad52d916bc0\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.639549 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-4d7hn" event={"ID":"b5e72b39-6085-4753-8b7d-a93a80c95d49","Type":"ContainerDied","Data":"2a68a42d8895510ad7ca8c635add6a94961e7a340ade2d4390e22073b385345a"} Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.639577 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a68a42d8895510ad7ca8c635add6a94961e7a340ade2d4390e22073b385345a" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.639628 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-4d7hn" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.640925 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b74e699-bc4f-4415-a9dc-8ad52d916bc0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b74e699-bc4f-4415-a9dc-8ad52d916bc0" (UID: "8b74e699-bc4f-4415-a9dc-8ad52d916bc0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.643135 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1d7e-account-create-update-7jmk5" event={"ID":"4e1ef9e5-26ab-4b7b-b255-73968ed867ce","Type":"ContainerDied","Data":"9a07e59cfbbbd47e4cc366cbab15fadcfeb5402edac13e46b25d05b3feb38e9f"} Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.643175 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a07e59cfbbbd47e4cc366cbab15fadcfeb5402edac13e46b25d05b3feb38e9f" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.643232 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1d7e-account-create-update-7jmk5" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.650263 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b74e699-bc4f-4415-a9dc-8ad52d916bc0-kube-api-access-wknbf" (OuterVolumeSpecName: "kube-api-access-wknbf") pod "8b74e699-bc4f-4415-a9dc-8ad52d916bc0" (UID: "8b74e699-bc4f-4415-a9dc-8ad52d916bc0"). InnerVolumeSpecName "kube-api-access-wknbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.742117 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5e72b39-6085-4753-8b7d-a93a80c95d49-operator-scripts\") pod \"b5e72b39-6085-4753-8b7d-a93a80c95d49\" (UID: \"b5e72b39-6085-4753-8b7d-a93a80c95d49\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.742447 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6dcx\" (UniqueName: \"kubernetes.io/projected/c47d165e-de4e-4f3a-8f66-4dab149c7b5e-kube-api-access-p6dcx\") pod \"c47d165e-de4e-4f3a-8f66-4dab149c7b5e\" (UID: \"c47d165e-de4e-4f3a-8f66-4dab149c7b5e\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.742564 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e72b39-6085-4753-8b7d-a93a80c95d49-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5e72b39-6085-4753-8b7d-a93a80c95d49" (UID: "b5e72b39-6085-4753-8b7d-a93a80c95d49"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.742677 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c47d165e-de4e-4f3a-8f66-4dab149c7b5e-operator-scripts\") pod \"c47d165e-de4e-4f3a-8f66-4dab149c7b5e\" (UID: \"c47d165e-de4e-4f3a-8f66-4dab149c7b5e\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.742734 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q75bj\" (UniqueName: \"kubernetes.io/projected/4e1ef9e5-26ab-4b7b-b255-73968ed867ce-kube-api-access-q75bj\") pod \"4e1ef9e5-26ab-4b7b-b255-73968ed867ce\" (UID: \"4e1ef9e5-26ab-4b7b-b255-73968ed867ce\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.742793 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d46c6f8-aff0-4b28-a71b-d98a894afdaf-operator-scripts\") pod \"9d46c6f8-aff0-4b28-a71b-d98a894afdaf\" (UID: \"9d46c6f8-aff0-4b28-a71b-d98a894afdaf\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.742874 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s645b\" (UniqueName: \"kubernetes.io/projected/b5e72b39-6085-4753-8b7d-a93a80c95d49-kube-api-access-s645b\") pod \"b5e72b39-6085-4753-8b7d-a93a80c95d49\" (UID: \"b5e72b39-6085-4753-8b7d-a93a80c95d49\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.742908 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n99fz\" (UniqueName: \"kubernetes.io/projected/9d46c6f8-aff0-4b28-a71b-d98a894afdaf-kube-api-access-n99fz\") pod \"9d46c6f8-aff0-4b28-a71b-d98a894afdaf\" (UID: \"9d46c6f8-aff0-4b28-a71b-d98a894afdaf\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.742945 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e1ef9e5-26ab-4b7b-b255-73968ed867ce-operator-scripts\") pod \"4e1ef9e5-26ab-4b7b-b255-73968ed867ce\" (UID: \"4e1ef9e5-26ab-4b7b-b255-73968ed867ce\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.743903 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1ef9e5-26ab-4b7b-b255-73968ed867ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e1ef9e5-26ab-4b7b-b255-73968ed867ce" (UID: "4e1ef9e5-26ab-4b7b-b255-73968ed867ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.744305 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c47d165e-de4e-4f3a-8f66-4dab149c7b5e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c47d165e-de4e-4f3a-8f66-4dab149c7b5e" (UID: "c47d165e-de4e-4f3a-8f66-4dab149c7b5e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.744355 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wknbf\" (UniqueName: \"kubernetes.io/projected/8b74e699-bc4f-4415-a9dc-8ad52d916bc0-kube-api-access-wknbf\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.744389 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5e72b39-6085-4753-8b7d-a93a80c95d49-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.744398 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b74e699-bc4f-4415-a9dc-8ad52d916bc0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.745459 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d46c6f8-aff0-4b28-a71b-d98a894afdaf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d46c6f8-aff0-4b28-a71b-d98a894afdaf" (UID: "9d46c6f8-aff0-4b28-a71b-d98a894afdaf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.748637 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e1ef9e5-26ab-4b7b-b255-73968ed867ce-kube-api-access-q75bj" (OuterVolumeSpecName: "kube-api-access-q75bj") pod "4e1ef9e5-26ab-4b7b-b255-73968ed867ce" (UID: "4e1ef9e5-26ab-4b7b-b255-73968ed867ce"). InnerVolumeSpecName "kube-api-access-q75bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.750851 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c47d165e-de4e-4f3a-8f66-4dab149c7b5e-kube-api-access-p6dcx" (OuterVolumeSpecName: "kube-api-access-p6dcx") pod "c47d165e-de4e-4f3a-8f66-4dab149c7b5e" (UID: "c47d165e-de4e-4f3a-8f66-4dab149c7b5e"). InnerVolumeSpecName "kube-api-access-p6dcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.750897 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d46c6f8-aff0-4b28-a71b-d98a894afdaf-kube-api-access-n99fz" (OuterVolumeSpecName: "kube-api-access-n99fz") pod "9d46c6f8-aff0-4b28-a71b-d98a894afdaf" (UID: "9d46c6f8-aff0-4b28-a71b-d98a894afdaf"). InnerVolumeSpecName "kube-api-access-n99fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.750941 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e72b39-6085-4753-8b7d-a93a80c95d49-kube-api-access-s645b" (OuterVolumeSpecName: "kube-api-access-s645b") pod "b5e72b39-6085-4753-8b7d-a93a80c95d49" (UID: "b5e72b39-6085-4753-8b7d-a93a80c95d49"). InnerVolumeSpecName "kube-api-access-s645b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.790290 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-679d56c757-8hcnt_1e715356-9848-439f-a13d-eb00f34521ec/console/0.log" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.790389 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-679d56c757-8hcnt" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.849143 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6dcx\" (UniqueName: \"kubernetes.io/projected/c47d165e-de4e-4f3a-8f66-4dab149c7b5e-kube-api-access-p6dcx\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.849173 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c47d165e-de4e-4f3a-8f66-4dab149c7b5e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.849183 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q75bj\" (UniqueName: \"kubernetes.io/projected/4e1ef9e5-26ab-4b7b-b255-73968ed867ce-kube-api-access-q75bj\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.849192 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d46c6f8-aff0-4b28-a71b-d98a894afdaf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.849201 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s645b\" (UniqueName: \"kubernetes.io/projected/b5e72b39-6085-4753-8b7d-a93a80c95d49-kube-api-access-s645b\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.849211 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n99fz\" (UniqueName: \"kubernetes.io/projected/9d46c6f8-aff0-4b28-a71b-d98a894afdaf-kube-api-access-n99fz\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.849219 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e1ef9e5-26ab-4b7b-b255-73968ed867ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.950369 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xhnw\" (UniqueName: \"kubernetes.io/projected/1e715356-9848-439f-a13d-eb00f34521ec-kube-api-access-7xhnw\") pod \"1e715356-9848-439f-a13d-eb00f34521ec\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.950446 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e715356-9848-439f-a13d-eb00f34521ec-console-oauth-config\") pod \"1e715356-9848-439f-a13d-eb00f34521ec\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.950492 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-console-config\") pod \"1e715356-9848-439f-a13d-eb00f34521ec\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.950693 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-trusted-ca-bundle\") pod \"1e715356-9848-439f-a13d-eb00f34521ec\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.950735 4721 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e715356-9848-439f-a13d-eb00f34521ec-console-serving-cert\") pod \"1e715356-9848-439f-a13d-eb00f34521ec\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.950816 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-service-ca\") pod \"1e715356-9848-439f-a13d-eb00f34521ec\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.950870 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-oauth-serving-cert\") pod \"1e715356-9848-439f-a13d-eb00f34521ec\" (UID: \"1e715356-9848-439f-a13d-eb00f34521ec\") " Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.955956 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-service-ca" (OuterVolumeSpecName: "service-ca") pod "1e715356-9848-439f-a13d-eb00f34521ec" (UID: "1e715356-9848-439f-a13d-eb00f34521ec"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.956775 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1e715356-9848-439f-a13d-eb00f34521ec" (UID: "1e715356-9848-439f-a13d-eb00f34521ec"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.956797 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-console-config" (OuterVolumeSpecName: "console-config") pod "1e715356-9848-439f-a13d-eb00f34521ec" (UID: "1e715356-9848-439f-a13d-eb00f34521ec"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.956804 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1e715356-9848-439f-a13d-eb00f34521ec" (UID: "1e715356-9848-439f-a13d-eb00f34521ec"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.961716 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e715356-9848-439f-a13d-eb00f34521ec-kube-api-access-7xhnw" (OuterVolumeSpecName: "kube-api-access-7xhnw") pod "1e715356-9848-439f-a13d-eb00f34521ec" (UID: "1e715356-9848-439f-a13d-eb00f34521ec"). InnerVolumeSpecName "kube-api-access-7xhnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.961730 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e715356-9848-439f-a13d-eb00f34521ec-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1e715356-9848-439f-a13d-eb00f34521ec" (UID: "1e715356-9848-439f-a13d-eb00f34521ec"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:22:30 crc kubenswrapper[4721]: I0202 13:22:30.963287 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e715356-9848-439f-a13d-eb00f34521ec-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1e715356-9848-439f-a13d-eb00f34521ec" (UID: "1e715356-9848-439f-a13d-eb00f34521ec"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.054329 4721 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.054363 4721 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e715356-9848-439f-a13d-eb00f34521ec-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.054374 4721 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.054383 4721 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.054392 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xhnw\" (UniqueName: \"kubernetes.io/projected/1e715356-9848-439f-a13d-eb00f34521ec-kube-api-access-7xhnw\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.054404 4721 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e715356-9848-439f-a13d-eb00f34521ec-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.054416 4721 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e715356-9848-439f-a13d-eb00f34521ec-console-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.168578 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f5fd-account-create-update-8s8md"] Feb 02 13:22:31 crc kubenswrapper[4721]: W0202 13:22:31.183146 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8261a2f3_c66a_441c_9fc6_a7a6a744b8a3.slice/crio-5e42e28e547417761133e79bed203e5b34892ed4d47a98b44e202ab8a75a5f74 WatchSource:0}: Error finding container 5e42e28e547417761133e79bed203e5b34892ed4d47a98b44e202ab8a75a5f74: Status 404 returned error can't find the container with id 
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.184371 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e588-account-create-update-4crm9"]
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.199761 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-b945b"]
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.207211 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4msnh"]
Feb 02 13:22:31 crc kubenswrapper[4721]: W0202 13:22:31.222980 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd51234ae_bf99_49bc_a3bc_1b392f993726.slice/crio-3241884a94bec9f4ab719e5306750fbea8f5c6d76bdd6847e89608ef5d722eca WatchSource:0}: Error finding container 3241884a94bec9f4ab719e5306750fbea8f5c6d76bdd6847e89608ef5d722eca: Status 404 returned error can't find the container with id 3241884a94bec9f4ab719e5306750fbea8f5c6d76bdd6847e89608ef5d722eca
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.656014 4721 generic.go:334] "Generic (PLEG): container finished" podID="a57cea33-806c-4028-b59f-9f5e65289eac" containerID="c824f81205ecd08e0b369ac8397beb509a4ac36dc83f526d99dfec02aa78a3a3" exitCode=0
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.656116 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a57cea33-806c-4028-b59f-9f5e65289eac","Type":"ContainerDied","Data":"c824f81205ecd08e0b369ac8397beb509a4ac36dc83f526d99dfec02aa78a3a3"}
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.664663 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e588-account-create-update-4crm9" event={"ID":"8261a2f3-c66a-441c-9fc6-a7a6a744b8a3","Type":"ContainerStarted","Data":"041fca898cdc3bff357d1c5b88b2d8189fb9511b9b78d7578e5238edecd243a2"}
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.664701 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e588-account-create-update-4crm9" event={"ID":"8261a2f3-c66a-441c-9fc6-a7a6a744b8a3","Type":"ContainerStarted","Data":"5e42e28e547417761133e79bed203e5b34892ed4d47a98b44e202ab8a75a5f74"}
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.669194 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4msnh" event={"ID":"d51234ae-bf99-49bc-a3bc-1b392f993726","Type":"ContainerStarted","Data":"2e42426b62e4be73df9f47cc7c9f8475c22ad72a27299b42b9cb2460af93ca8b"}
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.669234 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4msnh" event={"ID":"d51234ae-bf99-49bc-a3bc-1b392f993726","Type":"ContainerStarted","Data":"3241884a94bec9f4ab719e5306750fbea8f5c6d76bdd6847e89608ef5d722eca"}
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.672383 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b945b" event={"ID":"e071a9e9-d1fa-41c2-a0b4-3ddc2470055b","Type":"ContainerStarted","Data":"17ddfbe07f4d8bd38ac75c2dd4cd30a97224663bc355d41b2756657171654039"}
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.672425 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b945b" event={"ID":"e071a9e9-d1fa-41c2-a0b4-3ddc2470055b","Type":"ContainerStarted","Data":"9e99bc7a71cde3a1726dccb420d69877ac39ece8a909aee37a1f367da53f1236"}
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.678944 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-679d56c757-8hcnt_1e715356-9848-439f-a13d-eb00f34521ec/console/0.log"
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.679024 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-679d56c757-8hcnt" event={"ID":"1e715356-9848-439f-a13d-eb00f34521ec","Type":"ContainerDied","Data":"d99eec54d3234b8ee9ca1f4e6b988bce26f945deba80a7904e833116e7ebcfe1"}
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.679079 4721 scope.go:117] "RemoveContainer" containerID="2105e396598b1fd13640d6e576494052ae1ae901f42a2b1dd0e7f495d1eec506"
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.679250 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-679d56c757-8hcnt"
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.686836 4721 generic.go:334] "Generic (PLEG): container finished" podID="b6dbd607-3fa8-48e0-b420-4e939a47c460" containerID="ee3f7790a31c71e90ec0266b207ae1db8fc80457afd04cb12988c018efe7f723" exitCode=0
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.686925 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"b6dbd607-3fa8-48e0-b420-4e939a47c460","Type":"ContainerDied","Data":"ee3f7790a31c71e90ec0266b207ae1db8fc80457afd04cb12988c018efe7f723"}
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.697894 4721 generic.go:334] "Generic (PLEG): container finished" podID="496bb19e-217b-4896-9bee-8082ac5da28b" containerID="77992fe4aaf2cf252e3f8b5179aa81c542c6cc143ebeb5bc6250cbc35937ab79" exitCode=0
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.697985 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"496bb19e-217b-4896-9bee-8082ac5da28b","Type":"ContainerDied","Data":"77992fe4aaf2cf252e3f8b5179aa81c542c6cc143ebeb5bc6250cbc35937ab79"}
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.716454 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f5fd-account-create-update-8s8md" event={"ID":"3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd","Type":"ContainerStarted","Data":"ce2892d88a036a25fa1c546c389b3e2c80e44cee3ffdb13ac0e6fe0ce93c414d"}
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.716550 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f5fd-account-create-update-8s8md" event={"ID":"3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd","Type":"ContainerStarted","Data":"0a13c2c88a06c50133a2555869f297c8c47bf2195f08e760fef3c19438536755"}
Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.755348 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4rnrx" event={"ID":"bd1f15d5-77dc-4b6d-81bf-c2a8286da820","Type":"ContainerStarted","Data":"6cc2ebc0d6497a1bebde8bd624dd9a320b42cd5a52feffbd34983448a9d2b4a3"}
observedRunningTime="2026-02-02 13:22:31.76653198 +0000 UTC m=+1292.069046369" watchObservedRunningTime="2026-02-02 13:22:31.785963306 +0000 UTC m=+1292.088477695" Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.800270 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-e588-account-create-update-4crm9" podStartSLOduration=3.800251592 podStartE2EDuration="3.800251592s" podCreationTimestamp="2026-02-02 13:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:22:31.788502335 +0000 UTC m=+1292.091016734" watchObservedRunningTime="2026-02-02 13:22:31.800251592 +0000 UTC m=+1292.102765981" Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.811582 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-b945b" podStartSLOduration=2.811562647 podStartE2EDuration="2.811562647s" podCreationTimestamp="2026-02-02 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:22:31.807888029 +0000 UTC m=+1292.110402428" watchObservedRunningTime="2026-02-02 13:22:31.811562647 +0000 UTC m=+1292.114077046" Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.846600 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-f5fd-account-create-update-8s8md" podStartSLOduration=2.846575564 podStartE2EDuration="2.846575564s" podCreationTimestamp="2026-02-02 13:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:22:31.842105183 +0000 UTC m=+1292.144619572" watchObservedRunningTime="2026-02-02 13:22:31.846575564 +0000 UTC m=+1292.149089953" Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.894612 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-4rnrx" podStartSLOduration=5.215218677 podStartE2EDuration="8.894590983s" podCreationTimestamp="2026-02-02 13:22:23 +0000 UTC" firstStartedPulling="2026-02-02 13:22:26.720275419 +0000 UTC m=+1287.022789808" lastFinishedPulling="2026-02-02 13:22:30.399647725 +0000 UTC m=+1290.702162114" observedRunningTime="2026-02-02 13:22:31.888154849 +0000 UTC m=+1292.190669258" watchObservedRunningTime="2026-02-02 13:22:31.894590983 +0000 UTC m=+1292.197105382" Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.963357 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-679d56c757-8hcnt"] Feb 02 13:22:31 crc kubenswrapper[4721]: I0202 13:22:31.975512 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-679d56c757-8hcnt"] Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.420620 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e715356-9848-439f-a13d-eb00f34521ec" path="/var/lib/kubelet/pods/1e715356-9848-439f-a13d-eb00f34521ec/volumes" Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.730623 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vfbpx"] Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.741150 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-vfbpx"] Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.782651 4721 generic.go:334] "Generic (PLEG): container 
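The pod_startup_latency_tracker lines above can be checked by hand: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling); pods that pulled nothing report the zero time for both pull stamps and identical SLO/E2E values. Re-deriving the swift-ring-rebalance-4rnrx numbers from the line above:

    package main

    import (
    	"fmt"
    	"time"
    )

    // Layout matching the timestamps as printed in these log lines.
    const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

    func mustParse(s string) time.Time {
    	t, err := time.Parse(layout, s)
    	if err != nil {
    		panic(err)
    	}
    	return t
    }

    func main() {
    	created := mustParse("2026-02-02 13:22:23 +0000 UTC")
    	firstPull := mustParse("2026-02-02 13:22:26.720275419 +0000 UTC")
    	lastPull := mustParse("2026-02-02 13:22:30.399647725 +0000 UTC")
    	watchObserved := mustParse("2026-02-02 13:22:31.894590983 +0000 UTC")

    	e2e := watchObserved.Sub(created)      // 8.894590983s
    	slo := e2e - lastPull.Sub(firstPull)   // 5.215218677s
    	fmt.Println(e2e, slo)                  // both match the log exactly
    }

The same identity holds for the rabbitmq pods further down, whose minute-long E2E durations shrink by their ~3-5s pull windows.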
finished" podID="e071a9e9-d1fa-41c2-a0b4-3ddc2470055b" containerID="17ddfbe07f4d8bd38ac75c2dd4cd30a97224663bc355d41b2756657171654039" exitCode=0 Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.782774 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b945b" event={"ID":"e071a9e9-d1fa-41c2-a0b4-3ddc2470055b","Type":"ContainerDied","Data":"17ddfbe07f4d8bd38ac75c2dd4cd30a97224663bc355d41b2756657171654039"} Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.787997 4721 generic.go:334] "Generic (PLEG): container finished" podID="3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd" containerID="ce2892d88a036a25fa1c546c389b3e2c80e44cee3ffdb13ac0e6fe0ce93c414d" exitCode=0 Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.788051 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f5fd-account-create-update-8s8md" event={"ID":"3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd","Type":"ContainerDied","Data":"ce2892d88a036a25fa1c546c389b3e2c80e44cee3ffdb13ac0e6fe0ce93c414d"} Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.790611 4721 generic.go:334] "Generic (PLEG): container finished" podID="4d21d961-1540-4610-89c0-ee265f66d728" containerID="562fa5c00d88c0f5830431f80588f0c092e7efe8f3354457564b72a3bf152ac5" exitCode=0 Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.790667 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"4d21d961-1540-4610-89c0-ee265f66d728","Type":"ContainerDied","Data":"562fa5c00d88c0f5830431f80588f0c092e7efe8f3354457564b72a3bf152ac5"} Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.794566 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"496bb19e-217b-4896-9bee-8082ac5da28b","Type":"ContainerStarted","Data":"872bd030dc963253a68ebeb270b4c0a194dbd3eca5e9c121eb2ad50da1358c6e"} Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.794853 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.797868 4721 generic.go:334] "Generic (PLEG): container finished" podID="8261a2f3-c66a-441c-9fc6-a7a6a744b8a3" containerID="041fca898cdc3bff357d1c5b88b2d8189fb9511b9b78d7578e5238edecd243a2" exitCode=0 Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.798100 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e588-account-create-update-4crm9" event={"ID":"8261a2f3-c66a-441c-9fc6-a7a6a744b8a3","Type":"ContainerDied","Data":"041fca898cdc3bff357d1c5b88b2d8189fb9511b9b78d7578e5238edecd243a2"} Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.800964 4721 generic.go:334] "Generic (PLEG): container finished" podID="d51234ae-bf99-49bc-a3bc-1b392f993726" containerID="2e42426b62e4be73df9f47cc7c9f8475c22ad72a27299b42b9cb2460af93ca8b" exitCode=0 Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.801021 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4msnh" event={"ID":"d51234ae-bf99-49bc-a3bc-1b392f993726","Type":"ContainerDied","Data":"2e42426b62e4be73df9f47cc7c9f8475c22ad72a27299b42b9cb2460af93ca8b"} Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.804491 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"b6dbd607-3fa8-48e0-b420-4e939a47c460","Type":"ContainerStarted","Data":"151f885add37f297ff13c58dc930184ef33697a0f39d1cd7915f4828c211c2f2"} Feb 02 13:22:32 crc 
kubenswrapper[4721]: I0202 13:22:32.804712 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.809021 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a57cea33-806c-4028-b59f-9f5e65289eac","Type":"ContainerStarted","Data":"8c7f16314c0b8e5bcf446ede68e5cc86f001da0978108c61158b336f333031d2"} Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.809406 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.840588 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=55.570093647 podStartE2EDuration="58.840565298s" podCreationTimestamp="2026-02-02 13:21:34 +0000 UTC" firstStartedPulling="2026-02-02 13:21:54.331673318 +0000 UTC m=+1254.634187707" lastFinishedPulling="2026-02-02 13:21:57.602144969 +0000 UTC m=+1257.904659358" observedRunningTime="2026-02-02 13:22:32.830122286 +0000 UTC m=+1293.132636675" watchObservedRunningTime="2026-02-02 13:22:32.840565298 +0000 UTC m=+1293.143079707" Feb 02 13:22:32 crc kubenswrapper[4721]: I0202 13:22:32.965609 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=54.783106792 podStartE2EDuration="58.965591009s" podCreationTimestamp="2026-02-02 13:21:34 +0000 UTC" firstStartedPulling="2026-02-02 13:21:53.427296118 +0000 UTC m=+1253.729810507" lastFinishedPulling="2026-02-02 13:21:57.609780335 +0000 UTC m=+1257.912294724" observedRunningTime="2026-02-02 13:22:32.965377672 +0000 UTC m=+1293.267892081" watchObservedRunningTime="2026-02-02 13:22:32.965591009 +0000 UTC m=+1293.268105418" Feb 02 13:22:33 crc kubenswrapper[4721]: I0202 13:22:33.055894 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=55.738103501 podStartE2EDuration="59.055866949s" podCreationTimestamp="2026-02-02 13:21:34 +0000 UTC" firstStartedPulling="2026-02-02 13:21:54.283774794 +0000 UTC m=+1254.586289183" lastFinishedPulling="2026-02-02 13:21:57.601538242 +0000 UTC m=+1257.904052631" observedRunningTime="2026-02-02 13:22:33.013802902 +0000 UTC m=+1293.316317291" watchObservedRunningTime="2026-02-02 13:22:33.055866949 +0000 UTC m=+1293.358381348" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.434669 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c47d165e-de4e-4f3a-8f66-4dab149c7b5e" path="/var/lib/kubelet/pods/c47d165e-de4e-4f3a-8f66-4dab149c7b5e/volumes" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.489926 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4msnh" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.515412 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e588-account-create-update-4crm9" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.580388 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f5fd-account-create-update-8s8md" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.669768 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzx4c\" (UniqueName: \"kubernetes.io/projected/d51234ae-bf99-49bc-a3bc-1b392f993726-kube-api-access-gzx4c\") pod \"d51234ae-bf99-49bc-a3bc-1b392f993726\" (UID: \"d51234ae-bf99-49bc-a3bc-1b392f993726\") " Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.669969 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8261a2f3-c66a-441c-9fc6-a7a6a744b8a3-operator-scripts\") pod \"8261a2f3-c66a-441c-9fc6-a7a6a744b8a3\" (UID: \"8261a2f3-c66a-441c-9fc6-a7a6a744b8a3\") " Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.670102 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpscl\" (UniqueName: \"kubernetes.io/projected/8261a2f3-c66a-441c-9fc6-a7a6a744b8a3-kube-api-access-kpscl\") pod \"8261a2f3-c66a-441c-9fc6-a7a6a744b8a3\" (UID: \"8261a2f3-c66a-441c-9fc6-a7a6a744b8a3\") " Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.670144 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d51234ae-bf99-49bc-a3bc-1b392f993726-operator-scripts\") pod \"d51234ae-bf99-49bc-a3bc-1b392f993726\" (UID: \"d51234ae-bf99-49bc-a3bc-1b392f993726\") " Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.671087 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d51234ae-bf99-49bc-a3bc-1b392f993726-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d51234ae-bf99-49bc-a3bc-1b392f993726" (UID: "d51234ae-bf99-49bc-a3bc-1b392f993726"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.671225 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8261a2f3-c66a-441c-9fc6-a7a6a744b8a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8261a2f3-c66a-441c-9fc6-a7a6a744b8a3" (UID: "8261a2f3-c66a-441c-9fc6-a7a6a744b8a3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.671591 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd-operator-scripts\") pod \"3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd\" (UID: \"3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd\") " Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.672275 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd" (UID: "3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.674810 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8261a2f3-c66a-441c-9fc6-a7a6a744b8a3-kube-api-access-kpscl" (OuterVolumeSpecName: "kube-api-access-kpscl") pod "8261a2f3-c66a-441c-9fc6-a7a6a744b8a3" (UID: "8261a2f3-c66a-441c-9fc6-a7a6a744b8a3"). InnerVolumeSpecName "kube-api-access-kpscl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.676484 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d51234ae-bf99-49bc-a3bc-1b392f993726-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.676515 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.676531 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8261a2f3-c66a-441c-9fc6-a7a6a744b8a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.678136 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d51234ae-bf99-49bc-a3bc-1b392f993726-kube-api-access-gzx4c" (OuterVolumeSpecName: "kube-api-access-gzx4c") pod "d51234ae-bf99-49bc-a3bc-1b392f993726" (UID: "d51234ae-bf99-49bc-a3bc-1b392f993726"). InnerVolumeSpecName "kube-api-access-gzx4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.745886 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-b945b" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.777324 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lfn7\" (UniqueName: \"kubernetes.io/projected/3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd-kube-api-access-8lfn7\") pod \"3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd\" (UID: \"3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd\") " Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.777404 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e071a9e9-d1fa-41c2-a0b4-3ddc2470055b-operator-scripts\") pod \"e071a9e9-d1fa-41c2-a0b4-3ddc2470055b\" (UID: \"e071a9e9-d1fa-41c2-a0b4-3ddc2470055b\") " Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.777556 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmwlz\" (UniqueName: \"kubernetes.io/projected/e071a9e9-d1fa-41c2-a0b4-3ddc2470055b-kube-api-access-mmwlz\") pod \"e071a9e9-d1fa-41c2-a0b4-3ddc2470055b\" (UID: \"e071a9e9-d1fa-41c2-a0b4-3ddc2470055b\") " Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.778038 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpscl\" (UniqueName: \"kubernetes.io/projected/8261a2f3-c66a-441c-9fc6-a7a6a744b8a3-kube-api-access-kpscl\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.778058 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzx4c\" (UniqueName: \"kubernetes.io/projected/d51234ae-bf99-49bc-a3bc-1b392f993726-kube-api-access-gzx4c\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.781799 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e071a9e9-d1fa-41c2-a0b4-3ddc2470055b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e071a9e9-d1fa-41c2-a0b4-3ddc2470055b" (UID: "e071a9e9-d1fa-41c2-a0b4-3ddc2470055b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.782273 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e071a9e9-d1fa-41c2-a0b4-3ddc2470055b-kube-api-access-mmwlz" (OuterVolumeSpecName: "kube-api-access-mmwlz") pod "e071a9e9-d1fa-41c2-a0b4-3ddc2470055b" (UID: "e071a9e9-d1fa-41c2-a0b4-3ddc2470055b"). InnerVolumeSpecName "kube-api-access-mmwlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.787454 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd-kube-api-access-8lfn7" (OuterVolumeSpecName: "kube-api-access-8lfn7") pod "3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd" (UID: "3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd"). InnerVolumeSpecName "kube-api-access-8lfn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.787676 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hrqtc"] Feb 02 13:22:34 crc kubenswrapper[4721]: E0202 13:22:34.788320 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d51234ae-bf99-49bc-a3bc-1b392f993726" containerName="mariadb-database-create" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788347 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="d51234ae-bf99-49bc-a3bc-1b392f993726" containerName="mariadb-database-create" Feb 02 13:22:34 crc kubenswrapper[4721]: E0202 13:22:34.788369 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1ef9e5-26ab-4b7b-b255-73968ed867ce" containerName="mariadb-account-create-update" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788380 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1ef9e5-26ab-4b7b-b255-73968ed867ce" containerName="mariadb-account-create-update" Feb 02 13:22:34 crc kubenswrapper[4721]: E0202 13:22:34.788397 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b74e699-bc4f-4415-a9dc-8ad52d916bc0" containerName="mariadb-account-create-update" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788405 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b74e699-bc4f-4415-a9dc-8ad52d916bc0" containerName="mariadb-account-create-update" Feb 02 13:22:34 crc kubenswrapper[4721]: E0202 13:22:34.788421 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e715356-9848-439f-a13d-eb00f34521ec" containerName="console" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788429 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e715356-9848-439f-a13d-eb00f34521ec" containerName="console" Feb 02 13:22:34 crc kubenswrapper[4721]: E0202 13:22:34.788451 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e72b39-6085-4753-8b7d-a93a80c95d49" containerName="mariadb-database-create" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788475 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e72b39-6085-4753-8b7d-a93a80c95d49" containerName="mariadb-database-create" Feb 02 13:22:34 crc kubenswrapper[4721]: E0202 13:22:34.788490 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd" containerName="mariadb-account-create-update" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788498 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd" containerName="mariadb-account-create-update" Feb 02 13:22:34 crc kubenswrapper[4721]: E0202 13:22:34.788515 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d46c6f8-aff0-4b28-a71b-d98a894afdaf" containerName="mariadb-database-create" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788523 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d46c6f8-aff0-4b28-a71b-d98a894afdaf" containerName="mariadb-database-create" Feb 02 13:22:34 crc kubenswrapper[4721]: E0202 13:22:34.788533 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c47d165e-de4e-4f3a-8f66-4dab149c7b5e" containerName="mariadb-account-create-update" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788540 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="c47d165e-de4e-4f3a-8f66-4dab149c7b5e" containerName="mariadb-account-create-update" Feb 02 13:22:34 crc 
kubenswrapper[4721]: E0202 13:22:34.788555 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8261a2f3-c66a-441c-9fc6-a7a6a744b8a3" containerName="mariadb-account-create-update" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788563 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="8261a2f3-c66a-441c-9fc6-a7a6a744b8a3" containerName="mariadb-account-create-update" Feb 02 13:22:34 crc kubenswrapper[4721]: E0202 13:22:34.788572 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e071a9e9-d1fa-41c2-a0b4-3ddc2470055b" containerName="mariadb-database-create" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788581 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="e071a9e9-d1fa-41c2-a0b4-3ddc2470055b" containerName="mariadb-database-create" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788817 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="8261a2f3-c66a-441c-9fc6-a7a6a744b8a3" containerName="mariadb-account-create-update" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788833 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="e071a9e9-d1fa-41c2-a0b4-3ddc2470055b" containerName="mariadb-database-create" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788844 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd" containerName="mariadb-account-create-update" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788864 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="d51234ae-bf99-49bc-a3bc-1b392f993726" containerName="mariadb-database-create" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788876 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d46c6f8-aff0-4b28-a71b-d98a894afdaf" containerName="mariadb-database-create" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788887 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e72b39-6085-4753-8b7d-a93a80c95d49" containerName="mariadb-database-create" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788904 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e715356-9848-439f-a13d-eb00f34521ec" containerName="console" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788917 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1ef9e5-26ab-4b7b-b255-73968ed867ce" containerName="mariadb-account-create-update" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788928 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b74e699-bc4f-4415-a9dc-8ad52d916bc0" containerName="mariadb-account-create-update" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.788941 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="c47d165e-de4e-4f3a-8f66-4dab149c7b5e" containerName="mariadb-account-create-update" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.789799 4721 util.go:30] "No sandbox for pod can be found. 
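The cpu_manager/memory_manager RemoveStaleState pass above runs while admitting glance-db-sync-hrqtc: pinning assignments checkpointed for containers of pods the kubelet no longer tracks (the finished db-create/account-create jobs and the deleted console pod) are dropped before the new pod is accounted. An illustrative sketch of that cleanup with invented types, not kubelet's actual state API:

    package main

    import "fmt"

    type containerRef struct{ podUID, container string }

    // removeStaleState drops checkpointed assignments whose owning pod is
    // no longer active, mirroring the cpu_manager.go:410 / state_mem.go:107
    // pairs in the log above.
    func removeStaleState(assignments map[containerRef]string, activePods map[string]bool) {
    	for ref := range assignments {
    		if !activePods[ref.podUID] {
    			fmt.Printf("RemoveStaleState: removing container pod=%s name=%s\n", ref.podUID, ref.container)
    			delete(assignments, ref) // "Deleted CPUSet assignment"
    		}
    	}
    }

    func main() {
    	assignments := map[containerRef]string{
    		{"d51234ae-bf99-49bc-a3bc-1b392f993726", "mariadb-database-create"}: "0-3",
    		{"1e715356-9848-439f-a13d-eb00f34521ec", "console"}:                 "0-3",
    	}
    	removeStaleState(assignments, map[string]bool{ /* neither pod is still active */ })
    	fmt.Println("remaining assignments:", len(assignments))
    }

Note the E-level lines here are housekeeping noise rather than failures; the paired I-level lines confirm each deletion succeeded.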
Need to start a new one" pod="openstack/glance-db-sync-hrqtc" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.792265 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-f7kg2" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.792540 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.838024 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hrqtc"] Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.854600 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"4d21d961-1540-4610-89c0-ee265f66d728","Type":"ContainerStarted","Data":"3daccf3c85c15ee03cbfeaf49005630c3a330eee6f6b7cb983a9433547468750"} Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.855244 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.860234 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e588-account-create-update-4crm9" event={"ID":"8261a2f3-c66a-441c-9fc6-a7a6a744b8a3","Type":"ContainerDied","Data":"5e42e28e547417761133e79bed203e5b34892ed4d47a98b44e202ab8a75a5f74"} Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.860276 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e42e28e547417761133e79bed203e5b34892ed4d47a98b44e202ab8a75a5f74" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.860341 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e588-account-create-update-4crm9" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.875962 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4msnh" event={"ID":"d51234ae-bf99-49bc-a3bc-1b392f993726","Type":"ContainerDied","Data":"3241884a94bec9f4ab719e5306750fbea8f5c6d76bdd6847e89608ef5d722eca"} Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.876005 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3241884a94bec9f4ab719e5306750fbea8f5c6d76bdd6847e89608ef5d722eca" Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.876171 4721 util.go:48] "No ready sandbox for pod can be found. 
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.888123 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-db-sync-config-data\") pod \"glance-db-sync-hrqtc\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " pod="openstack/glance-db-sync-hrqtc"
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.888326 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5rlj\" (UniqueName: \"kubernetes.io/projected/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-kube-api-access-s5rlj\") pod \"glance-db-sync-hrqtc\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " pod="openstack/glance-db-sync-hrqtc"
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.888406 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-combined-ca-bundle\") pod \"glance-db-sync-hrqtc\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " pod="openstack/glance-db-sync-hrqtc"
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.888605 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-config-data\") pod \"glance-db-sync-hrqtc\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " pod="openstack/glance-db-sync-hrqtc"
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.888870 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lfn7\" (UniqueName: \"kubernetes.io/projected/3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd-kube-api-access-8lfn7\") on node \"crc\" DevicePath \"\""
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.888961 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e071a9e9-d1fa-41c2-a0b4-3ddc2470055b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.889076 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmwlz\" (UniqueName: \"kubernetes.io/projected/e071a9e9-d1fa-41c2-a0b4-3ddc2470055b-kube-api-access-mmwlz\") on node \"crc\" DevicePath \"\""
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.898439 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-b945b" event={"ID":"e071a9e9-d1fa-41c2-a0b4-3ddc2470055b","Type":"ContainerDied","Data":"9e99bc7a71cde3a1726dccb420d69877ac39ece8a909aee37a1f367da53f1236"}
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.898798 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-b945b"
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.898827 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e99bc7a71cde3a1726dccb420d69877ac39ece8a909aee37a1f367da53f1236"
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.919718 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f5fd-account-create-update-8s8md" event={"ID":"3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd","Type":"ContainerDied","Data":"0a13c2c88a06c50133a2555869f297c8c47bf2195f08e760fef3c19438536755"}
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.919767 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a13c2c88a06c50133a2555869f297c8c47bf2195f08e760fef3c19438536755"
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.919874 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f5fd-account-create-update-8s8md"
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.929446 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=55.762468398 podStartE2EDuration="1m0.929416572s" podCreationTimestamp="2026-02-02 13:21:34 +0000 UTC" firstStartedPulling="2026-02-02 13:21:52.432697638 +0000 UTC m=+1252.735212027" lastFinishedPulling="2026-02-02 13:21:57.599645812 +0000 UTC m=+1257.902160201" observedRunningTime="2026-02-02 13:22:34.906667977 +0000 UTC m=+1295.209182366" watchObservedRunningTime="2026-02-02 13:22:34.929416572 +0000 UTC m=+1295.231930961"
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.992745 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5rlj\" (UniqueName: \"kubernetes.io/projected/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-kube-api-access-s5rlj\") pod \"glance-db-sync-hrqtc\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " pod="openstack/glance-db-sync-hrqtc"
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.993135 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-combined-ca-bundle\") pod \"glance-db-sync-hrqtc\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " pod="openstack/glance-db-sync-hrqtc"
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.993234 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-config-data\") pod \"glance-db-sync-hrqtc\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " pod="openstack/glance-db-sync-hrqtc"
Feb 02 13:22:34 crc kubenswrapper[4721]: I0202 13:22:34.993327 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-db-sync-config-data\") pod \"glance-db-sync-hrqtc\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " pod="openstack/glance-db-sync-hrqtc"
Feb 02 13:22:35 crc kubenswrapper[4721]: I0202 13:22:35.001238 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-db-sync-config-data\") pod \"glance-db-sync-hrqtc\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " pod="openstack/glance-db-sync-hrqtc"
Feb 02 13:22:35 crc kubenswrapper[4721]: I0202 13:22:35.004399 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-config-data\") pod \"glance-db-sync-hrqtc\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " pod="openstack/glance-db-sync-hrqtc"
Feb 02 13:22:35 crc kubenswrapper[4721]: I0202 13:22:35.011764 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-combined-ca-bundle\") pod \"glance-db-sync-hrqtc\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " pod="openstack/glance-db-sync-hrqtc"
Feb 02 13:22:35 crc kubenswrapper[4721]: I0202 13:22:35.015721 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5rlj\" (UniqueName: \"kubernetes.io/projected/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-kube-api-access-s5rlj\") pod \"glance-db-sync-hrqtc\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " pod="openstack/glance-db-sync-hrqtc"
Feb 02 13:22:35 crc kubenswrapper[4721]: I0202 13:22:35.061731 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Feb 02 13:22:35 crc kubenswrapper[4721]: I0202 13:22:35.124399 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hrqtc"
Feb 02 13:22:35 crc kubenswrapper[4721]: I0202 13:22:35.783282 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hrqtc"]
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:35.931212 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hrqtc" event={"ID":"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d","Type":"ContainerStarted","Data":"bcc0541c75c4c63b75ba7d9003f7cf6d54e2e725d204aed88938ca8246ddf26a"}
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:35.936553 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"754114a2-a012-43fe-923b-a8cc3df91aa0","Type":"ContainerStarted","Data":"048ea46306d4b172a5e08114cafc091bb471ab145b3144b948269acd23158658"}
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:35.974708 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.474550418 podStartE2EDuration="54.974690863s" podCreationTimestamp="2026-02-02 13:21:41 +0000 UTC" firstStartedPulling="2026-02-02 13:21:54.269860517 +0000 UTC m=+1254.572374906" lastFinishedPulling="2026-02-02 13:22:34.770000962 +0000 UTC m=+1295.072515351" observedRunningTime="2026-02-02 13:22:35.959107432 +0000 UTC m=+1296.261621841" watchObservedRunningTime="2026-02-02 13:22:35.974690863 +0000 UTC m=+1296.277205252"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.151932 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-29h28"]
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.153368 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-29h28"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.160133 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.165960 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-29h28"]
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.325417 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca971a3a-e7fd-4f31-be3d-aff5722ad49f-operator-scripts\") pod \"root-account-create-update-29h28\" (UID: \"ca971a3a-e7fd-4f31-be3d-aff5722ad49f\") " pod="openstack/root-account-create-update-29h28"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.325638 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q99nt\" (UniqueName: \"kubernetes.io/projected/ca971a3a-e7fd-4f31-be3d-aff5722ad49f-kube-api-access-q99nt\") pod \"root-account-create-update-29h28\" (UID: \"ca971a3a-e7fd-4f31-be3d-aff5722ad49f\") " pod="openstack/root-account-create-update-29h28"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.427331 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q99nt\" (UniqueName: \"kubernetes.io/projected/ca971a3a-e7fd-4f31-be3d-aff5722ad49f-kube-api-access-q99nt\") pod \"root-account-create-update-29h28\" (UID: \"ca971a3a-e7fd-4f31-be3d-aff5722ad49f\") " pod="openstack/root-account-create-update-29h28"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.427478 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca971a3a-e7fd-4f31-be3d-aff5722ad49f-operator-scripts\") pod \"root-account-create-update-29h28\" (UID: \"ca971a3a-e7fd-4f31-be3d-aff5722ad49f\") " pod="openstack/root-account-create-update-29h28"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.428639 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca971a3a-e7fd-4f31-be3d-aff5722ad49f-operator-scripts\") pod \"root-account-create-update-29h28\" (UID: \"ca971a3a-e7fd-4f31-be3d-aff5722ad49f\") " pod="openstack/root-account-create-update-29h28"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.492736 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q99nt\" (UniqueName: \"kubernetes.io/projected/ca971a3a-e7fd-4f31-be3d-aff5722ad49f-kube-api-access-q99nt\") pod \"root-account-create-update-29h28\" (UID: \"ca971a3a-e7fd-4f31-be3d-aff5722ad49f\") " pod="openstack/root-account-create-update-29h28"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.708891 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5"]
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.710421 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.721012 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5"]
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.787015 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-29h28"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.838391 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5s7g\" (UniqueName: \"kubernetes.io/projected/5b1f70a8-6b41-4823-991b-934510a608fd-kube-api-access-g5s7g\") pod \"mysqld-exporter-openstack-cell1-db-create-4hfg5\" (UID: \"5b1f70a8-6b41-4823-991b-934510a608fd\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.838436 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b1f70a8-6b41-4823-991b-934510a608fd-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-4hfg5\" (UID: \"5b1f70a8-6b41-4823-991b-934510a608fd\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.896258 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-x4f5m"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.938537 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-ed80-account-create-update-w8c4k"]
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.940464 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-ed80-account-create-update-w8c4k"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.941219 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5s7g\" (UniqueName: \"kubernetes.io/projected/5b1f70a8-6b41-4823-991b-934510a608fd-kube-api-access-g5s7g\") pod \"mysqld-exporter-openstack-cell1-db-create-4hfg5\" (UID: \"5b1f70a8-6b41-4823-991b-934510a608fd\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.941266 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b1f70a8-6b41-4823-991b-934510a608fd-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-4hfg5\" (UID: \"5b1f70a8-6b41-4823-991b-934510a608fd\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.942851 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b1f70a8-6b41-4823-991b-934510a608fd-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-4hfg5\" (UID: \"5b1f70a8-6b41-4823-991b-934510a608fd\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.950224 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret"
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.975452 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-ed80-account-create-update-w8c4k"]
Feb 02 13:22:36 crc kubenswrapper[4721]: I0202 13:22:36.980014 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5s7g\" (UniqueName: \"kubernetes.io/projected/5b1f70a8-6b41-4823-991b-934510a608fd-kube-api-access-g5s7g\") pod \"mysqld-exporter-openstack-cell1-db-create-4hfg5\" (UID: \"5b1f70a8-6b41-4823-991b-934510a608fd\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5"
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.023970 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9n8f"]
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.024300 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" podUID="02b496f0-c99d-43e9-9e8a-03286d8966ab" containerName="dnsmasq-dns" containerID="cri-o://6404bb6f1951c91af9f8453984cfaf480f6307bbda26d7f2b85d0b4cee4e2109" gracePeriod=10
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.029357 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5"
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.043663 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0af979c8-207f-455c-b383-fd22b1ec6758-operator-scripts\") pod \"mysqld-exporter-ed80-account-create-update-w8c4k\" (UID: \"0af979c8-207f-455c-b383-fd22b1ec6758\") " pod="openstack/mysqld-exporter-ed80-account-create-update-w8c4k"
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.043769 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt798\" (UniqueName: \"kubernetes.io/projected/0af979c8-207f-455c-b383-fd22b1ec6758-kube-api-access-nt798\") pod \"mysqld-exporter-ed80-account-create-update-w8c4k\" (UID: \"0af979c8-207f-455c-b383-fd22b1ec6758\") " pod="openstack/mysqld-exporter-ed80-account-create-update-w8c4k"
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.146385 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0af979c8-207f-455c-b383-fd22b1ec6758-operator-scripts\") pod \"mysqld-exporter-ed80-account-create-update-w8c4k\" (UID: \"0af979c8-207f-455c-b383-fd22b1ec6758\") " pod="openstack/mysqld-exporter-ed80-account-create-update-w8c4k"
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.146514 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt798\" (UniqueName: \"kubernetes.io/projected/0af979c8-207f-455c-b383-fd22b1ec6758-kube-api-access-nt798\") pod \"mysqld-exporter-ed80-account-create-update-w8c4k\" (UID: \"0af979c8-207f-455c-b383-fd22b1ec6758\") " pod="openstack/mysqld-exporter-ed80-account-create-update-w8c4k"
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.148003 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0af979c8-207f-455c-b383-fd22b1ec6758-operator-scripts\") pod \"mysqld-exporter-ed80-account-create-update-w8c4k\" (UID: \"0af979c8-207f-455c-b383-fd22b1ec6758\") " pod="openstack/mysqld-exporter-ed80-account-create-update-w8c4k"
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.174268 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt798\" (UniqueName: \"kubernetes.io/projected/0af979c8-207f-455c-b383-fd22b1ec6758-kube-api-access-nt798\") pod \"mysqld-exporter-ed80-account-create-update-w8c4k\" (UID: \"0af979c8-207f-455c-b383-fd22b1ec6758\") " pod="openstack/mysqld-exporter-ed80-account-create-update-w8c4k"
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.269991 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-ed80-account-create-update-w8c4k"
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.662336 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-29h28"]
Feb 02 13:22:37 crc kubenswrapper[4721]: W0202 13:22:37.690035 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca971a3a_e7fd_4f31_be3d_aff5722ad49f.slice/crio-5db7592d5d6daeb2d6d714fe913b4d15bf0ef1454ed3e3239598bd3ede182d5c WatchSource:0}: Error finding container 5db7592d5d6daeb2d6d714fe913b4d15bf0ef1454ed3e3239598bd3ede182d5c: Status 404 returned error can't find the container with id 5db7592d5d6daeb2d6d714fe913b4d15bf0ef1454ed3e3239598bd3ede182d5c
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.867500 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5"]
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.968780 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-29h28" event={"ID":"ca971a3a-e7fd-4f31-be3d-aff5722ad49f","Type":"ContainerStarted","Data":"5db7592d5d6daeb2d6d714fe913b4d15bf0ef1454ed3e3239598bd3ede182d5c"}
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.970126 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5" event={"ID":"5b1f70a8-6b41-4823-991b-934510a608fd","Type":"ContainerStarted","Data":"71750a46c65416c25c5b66883a4f5da230a09a9166598a5e808e638dc3a967a8"}
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.973452 4721 generic.go:334] "Generic (PLEG): container finished" podID="02b496f0-c99d-43e9-9e8a-03286d8966ab" containerID="6404bb6f1951c91af9f8453984cfaf480f6307bbda26d7f2b85d0b4cee4e2109" exitCode=0
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.973512 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" event={"ID":"02b496f0-c99d-43e9-9e8a-03286d8966ab","Type":"ContainerDied","Data":"6404bb6f1951c91af9f8453984cfaf480f6307bbda26d7f2b85d0b4cee4e2109"}
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.973536 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f" event={"ID":"02b496f0-c99d-43e9-9e8a-03286d8966ab","Type":"ContainerDied","Data":"d756b18dd24d6e38b37f9d5682fc7973ac79057046e2526e8afd7ae562cccd1e"}
Feb 02 13:22:37 crc kubenswrapper[4721]: I0202 13:22:37.973549 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d756b18dd24d6e38b37f9d5682fc7973ac79057046e2526e8afd7ae562cccd1e"
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.015800 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f"
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.075745 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-ed80-account-create-update-w8c4k"]
Feb 02 13:22:38 crc kubenswrapper[4721]: W0202 13:22:38.086009 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0af979c8_207f_455c_b383_fd22b1ec6758.slice/crio-337291d05bad9f4bf913d361a804dc119df0d75bc05d5d9d53f3055d24ce4bd2 WatchSource:0}: Error finding container 337291d05bad9f4bf913d361a804dc119df0d75bc05d5d9d53f3055d24ce4bd2: Status 404 returned error can't find the container with id 337291d05bad9f4bf913d361a804dc119df0d75bc05d5d9d53f3055d24ce4bd2
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.111520 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.179558 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-config\") pod \"02b496f0-c99d-43e9-9e8a-03286d8966ab\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") "
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.179631 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-dns-svc\") pod \"02b496f0-c99d-43e9-9e8a-03286d8966ab\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") "
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.179842 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj6v4\" (UniqueName: \"kubernetes.io/projected/02b496f0-c99d-43e9-9e8a-03286d8966ab-kube-api-access-dj6v4\") pod \"02b496f0-c99d-43e9-9e8a-03286d8966ab\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") "
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.179902 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-ovsdbserver-nb\") pod \"02b496f0-c99d-43e9-9e8a-03286d8966ab\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") "
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.180032 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-ovsdbserver-sb\") pod \"02b496f0-c99d-43e9-9e8a-03286d8966ab\" (UID: \"02b496f0-c99d-43e9-9e8a-03286d8966ab\") "
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.199291 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02b496f0-c99d-43e9-9e8a-03286d8966ab-kube-api-access-dj6v4" (OuterVolumeSpecName: "kube-api-access-dj6v4") pod "02b496f0-c99d-43e9-9e8a-03286d8966ab" (UID: "02b496f0-c99d-43e9-9e8a-03286d8966ab"). InnerVolumeSpecName "kube-api-access-dj6v4". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.286049 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj6v4\" (UniqueName: \"kubernetes.io/projected/02b496f0-c99d-43e9-9e8a-03286d8966ab-kube-api-access-dj6v4\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.296916 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "02b496f0-c99d-43e9-9e8a-03286d8966ab" (UID: "02b496f0-c99d-43e9-9e8a-03286d8966ab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.306893 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "02b496f0-c99d-43e9-9e8a-03286d8966ab" (UID: "02b496f0-c99d-43e9-9e8a-03286d8966ab"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.326624 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-config" (OuterVolumeSpecName: "config") pod "02b496f0-c99d-43e9-9e8a-03286d8966ab" (UID: "02b496f0-c99d-43e9-9e8a-03286d8966ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.335629 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "02b496f0-c99d-43e9-9e8a-03286d8966ab" (UID: "02b496f0-c99d-43e9-9e8a-03286d8966ab"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.388568 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.388614 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.388626 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.388636 4721 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02b496f0-c99d-43e9-9e8a-03286d8966ab-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.490531 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:38 crc kubenswrapper[4721]: E0202 13:22:38.490799 4721 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 13:22:38 crc kubenswrapper[4721]: E0202 13:22:38.490833 4721 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 13:22:38 crc kubenswrapper[4721]: E0202 13:22:38.490902 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift podName:eabe6b07-da9d-4980-99b4-12c02640c88d nodeName:}" failed. No retries permitted until 2026-02-02 13:22:54.490881071 +0000 UTC m=+1314.793395460 (durationBeforeRetry 16s). 
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.986927 4721 generic.go:334] "Generic (PLEG): container finished" podID="5b1f70a8-6b41-4823-991b-934510a608fd" containerID="786d3f2324d84717b8e858d54067e88e6ae7b7e91f7a2cafa3e176317085dd6f" exitCode=0
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.987303 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5" event={"ID":"5b1f70a8-6b41-4823-991b-934510a608fd","Type":"ContainerDied","Data":"786d3f2324d84717b8e858d54067e88e6ae7b7e91f7a2cafa3e176317085dd6f"}
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.993398 4721 generic.go:334] "Generic (PLEG): container finished" podID="0af979c8-207f-455c-b383-fd22b1ec6758" containerID="c6d0cc979d5c7bfcd7c17e38f66db5aa66eb2098b9d5dff2ff5da7fb49088c43" exitCode=0
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.993532 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-ed80-account-create-update-w8c4k" event={"ID":"0af979c8-207f-455c-b383-fd22b1ec6758","Type":"ContainerDied","Data":"c6d0cc979d5c7bfcd7c17e38f66db5aa66eb2098b9d5dff2ff5da7fb49088c43"}
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.993561 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-ed80-account-create-update-w8c4k" event={"ID":"0af979c8-207f-455c-b383-fd22b1ec6758","Type":"ContainerStarted","Data":"337291d05bad9f4bf913d361a804dc119df0d75bc05d5d9d53f3055d24ce4bd2"}
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.995595 4721 generic.go:334] "Generic (PLEG): container finished" podID="ca971a3a-e7fd-4f31-be3d-aff5722ad49f" containerID="74a99b13280ba5e058fb97f392a9a2baa22e1224fb962f08950b59d7a1606135" exitCode=0
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.995667 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-x9n8f"
Feb 02 13:22:38 crc kubenswrapper[4721]: I0202 13:22:38.995801 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-29h28" event={"ID":"ca971a3a-e7fd-4f31-be3d-aff5722ad49f","Type":"ContainerDied","Data":"74a99b13280ba5e058fb97f392a9a2baa22e1224fb962f08950b59d7a1606135"}
Feb 02 13:22:39 crc kubenswrapper[4721]: I0202 13:22:39.051720 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9n8f"]
Feb 02 13:22:39 crc kubenswrapper[4721]: I0202 13:22:39.060083 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9n8f"]
Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.007561 4721 generic.go:334] "Generic (PLEG): container finished" podID="bd1f15d5-77dc-4b6d-81bf-c2a8286da820" containerID="6cc2ebc0d6497a1bebde8bd624dd9a320b42cd5a52feffbd34983448a9d2b4a3" exitCode=0
Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.007687 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4rnrx" event={"ID":"bd1f15d5-77dc-4b6d-81bf-c2a8286da820","Type":"ContainerDied","Data":"6cc2ebc0d6497a1bebde8bd624dd9a320b42cd5a52feffbd34983448a9d2b4a3"}
Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.459567 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02b496f0-c99d-43e9-9e8a-03286d8966ab" path="/var/lib/kubelet/pods/02b496f0-c99d-43e9-9e8a-03286d8966ab/volumes"
Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.833302 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-29h28"
Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.838990 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-ed80-account-create-update-w8c4k"
Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.850059 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5"
Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.946219 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b1f70a8-6b41-4823-991b-934510a608fd-operator-scripts\") pod \"5b1f70a8-6b41-4823-991b-934510a608fd\" (UID: \"5b1f70a8-6b41-4823-991b-934510a608fd\") "
Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.946273 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0af979c8-207f-455c-b383-fd22b1ec6758-operator-scripts\") pod \"0af979c8-207f-455c-b383-fd22b1ec6758\" (UID: \"0af979c8-207f-455c-b383-fd22b1ec6758\") "
Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.946422 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5s7g\" (UniqueName: \"kubernetes.io/projected/5b1f70a8-6b41-4823-991b-934510a608fd-kube-api-access-g5s7g\") pod \"5b1f70a8-6b41-4823-991b-934510a608fd\" (UID: \"5b1f70a8-6b41-4823-991b-934510a608fd\") "
Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.946473 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca971a3a-e7fd-4f31-be3d-aff5722ad49f-operator-scripts\") pod \"ca971a3a-e7fd-4f31-be3d-aff5722ad49f\" (UID: \"ca971a3a-e7fd-4f31-be3d-aff5722ad49f\") "
Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.946499 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q99nt\" (UniqueName: \"kubernetes.io/projected/ca971a3a-e7fd-4f31-be3d-aff5722ad49f-kube-api-access-q99nt\") pod \"ca971a3a-e7fd-4f31-be3d-aff5722ad49f\" (UID: \"ca971a3a-e7fd-4f31-be3d-aff5722ad49f\") "
Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.946521 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt798\" (UniqueName: \"kubernetes.io/projected/0af979c8-207f-455c-b383-fd22b1ec6758-kube-api-access-nt798\") pod \"0af979c8-207f-455c-b383-fd22b1ec6758\" (UID: \"0af979c8-207f-455c-b383-fd22b1ec6758\") "
Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.947266 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca971a3a-e7fd-4f31-be3d-aff5722ad49f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca971a3a-e7fd-4f31-be3d-aff5722ad49f" (UID: "ca971a3a-e7fd-4f31-be3d-aff5722ad49f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.947376 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b1f70a8-6b41-4823-991b-934510a608fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b1f70a8-6b41-4823-991b-934510a608fd" (UID: "5b1f70a8-6b41-4823-991b-934510a608fd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.947399 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0af979c8-207f-455c-b383-fd22b1ec6758-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0af979c8-207f-455c-b383-fd22b1ec6758" (UID: "0af979c8-207f-455c-b383-fd22b1ec6758"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.976737 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0af979c8-207f-455c-b383-fd22b1ec6758-kube-api-access-nt798" (OuterVolumeSpecName: "kube-api-access-nt798") pod "0af979c8-207f-455c-b383-fd22b1ec6758" (UID: "0af979c8-207f-455c-b383-fd22b1ec6758"). InnerVolumeSpecName "kube-api-access-nt798". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.977021 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca971a3a-e7fd-4f31-be3d-aff5722ad49f-kube-api-access-q99nt" (OuterVolumeSpecName: "kube-api-access-q99nt") pod "ca971a3a-e7fd-4f31-be3d-aff5722ad49f" (UID: "ca971a3a-e7fd-4f31-be3d-aff5722ad49f"). InnerVolumeSpecName "kube-api-access-q99nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:40 crc kubenswrapper[4721]: I0202 13:22:40.978254 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b1f70a8-6b41-4823-991b-934510a608fd-kube-api-access-g5s7g" (OuterVolumeSpecName: "kube-api-access-g5s7g") pod "5b1f70a8-6b41-4823-991b-934510a608fd" (UID: "5b1f70a8-6b41-4823-991b-934510a608fd"). InnerVolumeSpecName "kube-api-access-g5s7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.022256 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.024597 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5" event={"ID":"5b1f70a8-6b41-4823-991b-934510a608fd","Type":"ContainerDied","Data":"71750a46c65416c25c5b66883a4f5da230a09a9166598a5e808e638dc3a967a8"} Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.024645 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71750a46c65416c25c5b66883a4f5da230a09a9166598a5e808e638dc3a967a8" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.033923 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-ed80-account-create-update-w8c4k" event={"ID":"0af979c8-207f-455c-b383-fd22b1ec6758","Type":"ContainerDied","Data":"337291d05bad9f4bf913d361a804dc119df0d75bc05d5d9d53f3055d24ce4bd2"} Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.033971 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="337291d05bad9f4bf913d361a804dc119df0d75bc05d5d9d53f3055d24ce4bd2" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.034121 4721 util.go:48] "No ready sandbox for pod can be found. 
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.050214 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5s7g\" (UniqueName: \"kubernetes.io/projected/5b1f70a8-6b41-4823-991b-934510a608fd-kube-api-access-g5s7g\") on node \"crc\" DevicePath \"\""
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.050240 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca971a3a-e7fd-4f31-be3d-aff5722ad49f-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.050251 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q99nt\" (UniqueName: \"kubernetes.io/projected/ca971a3a-e7fd-4f31-be3d-aff5722ad49f-kube-api-access-q99nt\") on node \"crc\" DevicePath \"\""
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.050259 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt798\" (UniqueName: \"kubernetes.io/projected/0af979c8-207f-455c-b383-fd22b1ec6758-kube-api-access-nt798\") on node \"crc\" DevicePath \"\""
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.050270 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b1f70a8-6b41-4823-991b-934510a608fd-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.050279 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0af979c8-207f-455c-b383-fd22b1ec6758-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.050476 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-29h28" event={"ID":"ca971a3a-e7fd-4f31-be3d-aff5722ad49f","Type":"ContainerDied","Data":"5db7592d5d6daeb2d6d714fe913b4d15bf0ef1454ed3e3239598bd3ede182d5c"}
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.050515 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5db7592d5d6daeb2d6d714fe913b4d15bf0ef1454ed3e3239598bd3ede182d5c"
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.050544 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-29h28"
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.074961 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-l5h78" podUID="298ac2ef-6edb-40cb-bb92-8a8e039f333b" containerName="ovn-controller" probeResult="failure" output=<
Feb 02 13:22:41 crc kubenswrapper[4721]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 02 13:22:41 crc kubenswrapper[4721]: >
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.439038 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-4rnrx"
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.560409 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-dispersionconf\") pod \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") "
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.560752 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5rm5\" (UniqueName: \"kubernetes.io/projected/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-kube-api-access-q5rm5\") pod \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") "
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.560862 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-scripts\") pod \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") "
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.560946 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-ring-data-devices\") pod \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") "
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.561014 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-swiftconf\") pod \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") "
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.561111 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-etc-swift\") pod \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") "
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.561165 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-combined-ca-bundle\") pod \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\" (UID: \"bd1f15d5-77dc-4b6d-81bf-c2a8286da820\") "
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.565268 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "bd1f15d5-77dc-4b6d-81bf-c2a8286da820" (UID: "bd1f15d5-77dc-4b6d-81bf-c2a8286da820"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.565891 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bd1f15d5-77dc-4b6d-81bf-c2a8286da820" (UID: "bd1f15d5-77dc-4b6d-81bf-c2a8286da820"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.568834 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-kube-api-access-q5rm5" (OuterVolumeSpecName: "kube-api-access-q5rm5") pod "bd1f15d5-77dc-4b6d-81bf-c2a8286da820" (UID: "bd1f15d5-77dc-4b6d-81bf-c2a8286da820"). InnerVolumeSpecName "kube-api-access-q5rm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.570369 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "bd1f15d5-77dc-4b6d-81bf-c2a8286da820" (UID: "bd1f15d5-77dc-4b6d-81bf-c2a8286da820"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.594962 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd1f15d5-77dc-4b6d-81bf-c2a8286da820" (UID: "bd1f15d5-77dc-4b6d-81bf-c2a8286da820"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.598737 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "bd1f15d5-77dc-4b6d-81bf-c2a8286da820" (UID: "bd1f15d5-77dc-4b6d-81bf-c2a8286da820"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.601237 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-scripts" (OuterVolumeSpecName: "scripts") pod "bd1f15d5-77dc-4b6d-81bf-c2a8286da820" (UID: "bd1f15d5-77dc-4b6d-81bf-c2a8286da820"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.663608 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.663653 4721 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.663669 4721 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.663683 4721 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.663695 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.663708 4721 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:41 crc kubenswrapper[4721]: I0202 13:22:41.663722 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5rm5\" (UniqueName: \"kubernetes.io/projected/bd1f15d5-77dc-4b6d-81bf-c2a8286da820-kube-api-access-q5rm5\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:42 crc kubenswrapper[4721]: I0202 13:22:42.080574 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4rnrx" event={"ID":"bd1f15d5-77dc-4b6d-81bf-c2a8286da820","Type":"ContainerDied","Data":"75aba502874704cff954aba43521e9024690a740951c41d1966d4ffb33d85812"} Feb 02 13:22:42 crc kubenswrapper[4721]: I0202 13:22:42.081272 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75aba502874704cff954aba43521e9024690a740951c41d1966d4ffb33d85812" Feb 02 13:22:42 crc kubenswrapper[4721]: I0202 13:22:42.081278 4721 util.go:48] "No ready sandbox for pod can be found. 
Feb 02 13:22:42 crc kubenswrapper[4721]: I0202 13:22:42.794944 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-29h28"]
Feb 02 13:22:42 crc kubenswrapper[4721]: I0202 13:22:42.806755 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-29h28"]
Feb 02 13:22:43 crc kubenswrapper[4721]: I0202 13:22:43.111726 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Feb 02 13:22:43 crc kubenswrapper[4721]: I0202 13:22:43.114393 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Feb 02 13:22:44 crc kubenswrapper[4721]: I0202 13:22:44.102666 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Feb 02 13:22:44 crc kubenswrapper[4721]: I0202 13:22:44.421509 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca971a3a-e7fd-4f31-be3d-aff5722ad49f" path="/var/lib/kubelet/pods/ca971a3a-e7fd-4f31-be3d-aff5722ad49f/volumes"
Feb 02 13:22:45 crc kubenswrapper[4721]: I0202 13:22:45.827511 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="a57cea33-806c-4028-b59f-9f5e65289eac" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.129:5671: connect: connection refused"
Feb 02 13:22:45 crc kubenswrapper[4721]: I0202 13:22:45.986675 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-l5h78" podUID="298ac2ef-6edb-40cb-bb92-8a8e039f333b" containerName="ovn-controller" probeResult="failure" output=<
Feb 02 13:22:45 crc kubenswrapper[4721]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 02 13:22:45 crc kubenswrapper[4721]: >
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.011544 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gz9nz"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.024761 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gz9nz"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.177922 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="4d21d961-1540-4610-89c0-ee265f66d728" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.131:5671: connect: connection refused"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.178883 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="b6dbd607-3fa8-48e0-b420-4e939a47c460" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.130:5671: connect: connection refused"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.190993 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-nftsl"]
Feb 02 13:22:46 crc kubenswrapper[4721]: E0202 13:22:46.191719 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02b496f0-c99d-43e9-9e8a-03286d8966ab" containerName="init"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.191817 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="02b496f0-c99d-43e9-9e8a-03286d8966ab" containerName="init"
Feb 02 13:22:46 crc kubenswrapper[4721]: E0202 13:22:46.192664 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02b496f0-c99d-43e9-9e8a-03286d8966ab" containerName="dnsmasq-dns"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.192765 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="02b496f0-c99d-43e9-9e8a-03286d8966ab" containerName="dnsmasq-dns"
Feb 02 13:22:46 crc kubenswrapper[4721]: E0202 13:22:46.192846 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b1f70a8-6b41-4823-991b-934510a608fd" containerName="mariadb-database-create"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.192930 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b1f70a8-6b41-4823-991b-934510a608fd" containerName="mariadb-database-create"
Feb 02 13:22:46 crc kubenswrapper[4721]: E0202 13:22:46.193022 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd1f15d5-77dc-4b6d-81bf-c2a8286da820" containerName="swift-ring-rebalance"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.193131 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd1f15d5-77dc-4b6d-81bf-c2a8286da820" containerName="swift-ring-rebalance"
Feb 02 13:22:46 crc kubenswrapper[4721]: E0202 13:22:46.193235 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0af979c8-207f-455c-b383-fd22b1ec6758" containerName="mariadb-account-create-update"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.193313 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af979c8-207f-455c-b383-fd22b1ec6758" containerName="mariadb-account-create-update"
Feb 02 13:22:46 crc kubenswrapper[4721]: E0202 13:22:46.193413 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca971a3a-e7fd-4f31-be3d-aff5722ad49f" containerName="mariadb-account-create-update"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.193489 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca971a3a-e7fd-4f31-be3d-aff5722ad49f" containerName="mariadb-account-create-update"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.193905 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="0af979c8-207f-455c-b383-fd22b1ec6758" containerName="mariadb-account-create-update"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.194008 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="02b496f0-c99d-43e9-9e8a-03286d8966ab" containerName="dnsmasq-dns"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.194113 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd1f15d5-77dc-4b6d-81bf-c2a8286da820" containerName="swift-ring-rebalance"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.194215 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b1f70a8-6b41-4823-991b-934510a608fd" containerName="mariadb-database-create"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.194292 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca971a3a-e7fd-4f31-be3d-aff5722ad49f" containerName="mariadb-account-create-update"
Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.195387 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nftsl"
Need to start a new one" pod="openstack/root-account-create-update-nftsl" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.198466 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.212754 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nftsl"] Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.259993 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-l5h78-config-5bmh9"] Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.262098 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.264670 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.270884 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l5h78-config-5bmh9"] Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.274316 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d5ab80d-40d3-4259-a13a-efb59d66b725-operator-scripts\") pod \"root-account-create-update-nftsl\" (UID: \"3d5ab80d-40d3-4259-a13a-efb59d66b725\") " pod="openstack/root-account-create-update-nftsl" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.274509 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpl4j\" (UniqueName: \"kubernetes.io/projected/3d5ab80d-40d3-4259-a13a-efb59d66b725-kube-api-access-fpl4j\") pod \"root-account-create-update-nftsl\" (UID: \"3d5ab80d-40d3-4259-a13a-efb59d66b725\") " pod="openstack/root-account-create-update-nftsl" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.376018 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-run\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.376128 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ffb2d2fc-d882-4343-a182-5d4cae12692f-additional-scripts\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.376153 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffb2d2fc-d882-4343-a182-5d4cae12692f-scripts\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.376250 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpl4j\" (UniqueName: \"kubernetes.io/projected/3d5ab80d-40d3-4259-a13a-efb59d66b725-kube-api-access-fpl4j\") pod \"root-account-create-update-nftsl\" (UID: 
\"3d5ab80d-40d3-4259-a13a-efb59d66b725\") " pod="openstack/root-account-create-update-nftsl" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.376282 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-log-ovn\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.376347 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-run-ovn\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.376386 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8xf8\" (UniqueName: \"kubernetes.io/projected/ffb2d2fc-d882-4343-a182-5d4cae12692f-kube-api-access-t8xf8\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.376510 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d5ab80d-40d3-4259-a13a-efb59d66b725-operator-scripts\") pod \"root-account-create-update-nftsl\" (UID: \"3d5ab80d-40d3-4259-a13a-efb59d66b725\") " pod="openstack/root-account-create-update-nftsl" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.377419 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d5ab80d-40d3-4259-a13a-efb59d66b725-operator-scripts\") pod \"root-account-create-update-nftsl\" (UID: \"3d5ab80d-40d3-4259-a13a-efb59d66b725\") " pod="openstack/root-account-create-update-nftsl" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.397397 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpl4j\" (UniqueName: \"kubernetes.io/projected/3d5ab80d-40d3-4259-a13a-efb59d66b725-kube-api-access-fpl4j\") pod \"root-account-create-update-nftsl\" (UID: \"3d5ab80d-40d3-4259-a13a-efb59d66b725\") " pod="openstack/root-account-create-update-nftsl" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.478401 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-run-ovn\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.478490 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8xf8\" (UniqueName: \"kubernetes.io/projected/ffb2d2fc-d882-4343-a182-5d4cae12692f-kube-api-access-t8xf8\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.478595 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-run\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.478660 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ffb2d2fc-d882-4343-a182-5d4cae12692f-additional-scripts\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.478685 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffb2d2fc-d882-4343-a182-5d4cae12692f-scripts\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.478744 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-log-ovn\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.478857 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-run-ovn\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.478897 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-log-ovn\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.478950 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-run\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.479659 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ffb2d2fc-d882-4343-a182-5d4cae12692f-additional-scripts\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.480874 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffb2d2fc-d882-4343-a182-5d4cae12692f-scripts\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.497554 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8xf8\" (UniqueName: 
\"kubernetes.io/projected/ffb2d2fc-d882-4343-a182-5d4cae12692f-kube-api-access-t8xf8\") pod \"ovn-controller-l5h78-config-5bmh9\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.525491 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nftsl" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.567291 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.596645 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.859243 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.860668 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.862705 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.873079 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.990090 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jz54\" (UniqueName: \"kubernetes.io/projected/7a6930c7-1819-4b7d-baf6-773a8b68e568-kube-api-access-8jz54\") pod \"mysqld-exporter-0\" (UID: \"7a6930c7-1819-4b7d-baf6-773a8b68e568\") " pod="openstack/mysqld-exporter-0" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.990212 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a6930c7-1819-4b7d-baf6-773a8b68e568-config-data\") pod \"mysqld-exporter-0\" (UID: \"7a6930c7-1819-4b7d-baf6-773a8b68e568\") " pod="openstack/mysqld-exporter-0" Feb 02 13:22:46 crc kubenswrapper[4721]: I0202 13:22:46.990263 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6930c7-1819-4b7d-baf6-773a8b68e568-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"7a6930c7-1819-4b7d-baf6-773a8b68e568\") " pod="openstack/mysqld-exporter-0" Feb 02 13:22:47 crc kubenswrapper[4721]: I0202 13:22:47.092715 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jz54\" (UniqueName: \"kubernetes.io/projected/7a6930c7-1819-4b7d-baf6-773a8b68e568-kube-api-access-8jz54\") pod \"mysqld-exporter-0\" (UID: \"7a6930c7-1819-4b7d-baf6-773a8b68e568\") " pod="openstack/mysqld-exporter-0" Feb 02 13:22:47 crc kubenswrapper[4721]: I0202 13:22:47.092773 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a6930c7-1819-4b7d-baf6-773a8b68e568-config-data\") pod \"mysqld-exporter-0\" (UID: \"7a6930c7-1819-4b7d-baf6-773a8b68e568\") " pod="openstack/mysqld-exporter-0" Feb 02 13:22:47 crc kubenswrapper[4721]: I0202 13:22:47.092909 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7a6930c7-1819-4b7d-baf6-773a8b68e568-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"7a6930c7-1819-4b7d-baf6-773a8b68e568\") " pod="openstack/mysqld-exporter-0" Feb 02 13:22:47 crc kubenswrapper[4721]: I0202 13:22:47.097579 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a6930c7-1819-4b7d-baf6-773a8b68e568-config-data\") pod \"mysqld-exporter-0\" (UID: \"7a6930c7-1819-4b7d-baf6-773a8b68e568\") " pod="openstack/mysqld-exporter-0" Feb 02 13:22:47 crc kubenswrapper[4721]: I0202 13:22:47.117797 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6930c7-1819-4b7d-baf6-773a8b68e568-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"7a6930c7-1819-4b7d-baf6-773a8b68e568\") " pod="openstack/mysqld-exporter-0" Feb 02 13:22:47 crc kubenswrapper[4721]: I0202 13:22:47.123838 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jz54\" (UniqueName: \"kubernetes.io/projected/7a6930c7-1819-4b7d-baf6-773a8b68e568-kube-api-access-8jz54\") pod \"mysqld-exporter-0\" (UID: \"7a6930c7-1819-4b7d-baf6-773a8b68e568\") " pod="openstack/mysqld-exporter-0" Feb 02 13:22:47 crc kubenswrapper[4721]: I0202 13:22:47.163987 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 13:22:47 crc kubenswrapper[4721]: I0202 13:22:47.164364 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="prometheus" containerID="cri-o://ee494f7d068c5f80477a3c22c3a31526d01c4a02a690967dc4ac1b911a158a98" gracePeriod=600 Feb 02 13:22:47 crc kubenswrapper[4721]: I0202 13:22:47.164505 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="config-reloader" containerID="cri-o://cac1e1e212591677852178e9111f4a55b1b7ae8d7ed3f2354152657d155f18e1" gracePeriod=600 Feb 02 13:22:47 crc kubenswrapper[4721]: I0202 13:22:47.164516 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="thanos-sidecar" containerID="cri-o://048ea46306d4b172a5e08114cafc091bb471ab145b3144b948269acd23158658" gracePeriod=600 Feb 02 13:22:47 crc kubenswrapper[4721]: I0202 13:22:47.187053 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 02 13:22:48 crc kubenswrapper[4721]: I0202 13:22:48.130785 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.139:9090/-/ready\": dial tcp 10.217.0.139:9090: connect: connection refused" Feb 02 13:22:48 crc kubenswrapper[4721]: I0202 13:22:48.155279 4721 generic.go:334] "Generic (PLEG): container finished" podID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerID="048ea46306d4b172a5e08114cafc091bb471ab145b3144b948269acd23158658" exitCode=0 Feb 02 13:22:48 crc kubenswrapper[4721]: I0202 13:22:48.155316 4721 generic.go:334] "Generic (PLEG): container finished" podID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerID="cac1e1e212591677852178e9111f4a55b1b7ae8d7ed3f2354152657d155f18e1" exitCode=0 Feb 02 13:22:48 crc kubenswrapper[4721]: I0202 13:22:48.155324 4721 generic.go:334] "Generic (PLEG): container finished" podID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerID="ee494f7d068c5f80477a3c22c3a31526d01c4a02a690967dc4ac1b911a158a98" exitCode=0 Feb 02 13:22:48 crc kubenswrapper[4721]: I0202 13:22:48.155349 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"754114a2-a012-43fe-923b-a8cc3df91aa0","Type":"ContainerDied","Data":"048ea46306d4b172a5e08114cafc091bb471ab145b3144b948269acd23158658"} Feb 02 13:22:48 crc kubenswrapper[4721]: I0202 13:22:48.155378 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"754114a2-a012-43fe-923b-a8cc3df91aa0","Type":"ContainerDied","Data":"cac1e1e212591677852178e9111f4a55b1b7ae8d7ed3f2354152657d155f18e1"} Feb 02 13:22:48 crc kubenswrapper[4721]: I0202 13:22:48.155393 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"754114a2-a012-43fe-923b-a8cc3df91aa0","Type":"ContainerDied","Data":"ee494f7d068c5f80477a3c22c3a31526d01c4a02a690967dc4ac1b911a158a98"} Feb 02 13:22:50 crc kubenswrapper[4721]: I0202 13:22:50.981884 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-l5h78" podUID="298ac2ef-6edb-40cb-bb92-8a8e039f333b" containerName="ovn-controller" probeResult="failure" output=< Feb 02 13:22:50 crc kubenswrapper[4721]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 02 13:22:50 crc kubenswrapper[4721]: > Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.112772 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.139:9090/-/ready\": dial tcp 10.217.0.139:9090: connect: connection refused" Feb 02 13:22:53 crc kubenswrapper[4721]: E0202 13:22:53.232890 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Feb 02 13:22:53 crc kubenswrapper[4721]: E0202 13:22:53.233156 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s5rlj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-hrqtc_openstack(0531b398-2d44-42c2-bd6c-9e9f7ab8c85d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:22:53 crc kubenswrapper[4721]: E0202 13:22:53.234444 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-hrqtc" podUID="0531b398-2d44-42c2-bd6c-9e9f7ab8c85d" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.351878 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l5h78-config-5bmh9"] Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.388016 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-nftsl"] Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.597754 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.603764 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.615509 4721 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.675557 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-0\") pod \"754114a2-a012-43fe-923b-a8cc3df91aa0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.675933 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-2\") pod \"754114a2-a012-43fe-923b-a8cc3df91aa0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.676090 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-web-config\") pod \"754114a2-a012-43fe-923b-a8cc3df91aa0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.676243 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\") pod \"754114a2-a012-43fe-923b-a8cc3df91aa0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.676269 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq5th\" (UniqueName: \"kubernetes.io/projected/754114a2-a012-43fe-923b-a8cc3df91aa0-kube-api-access-nq5th\") pod \"754114a2-a012-43fe-923b-a8cc3df91aa0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.676295 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-thanos-prometheus-http-client-file\") pod \"754114a2-a012-43fe-923b-a8cc3df91aa0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.676311 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-config\") pod \"754114a2-a012-43fe-923b-a8cc3df91aa0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.676338 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/754114a2-a012-43fe-923b-a8cc3df91aa0-tls-assets\") pod \"754114a2-a012-43fe-923b-a8cc3df91aa0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.676386 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/754114a2-a012-43fe-923b-a8cc3df91aa0-config-out\") pod \"754114a2-a012-43fe-923b-a8cc3df91aa0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.676411 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-1\") pod \"754114a2-a012-43fe-923b-a8cc3df91aa0\" (UID: \"754114a2-a012-43fe-923b-a8cc3df91aa0\") " Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.676788 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "754114a2-a012-43fe-923b-a8cc3df91aa0" (UID: "754114a2-a012-43fe-923b-a8cc3df91aa0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.677818 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "754114a2-a012-43fe-923b-a8cc3df91aa0" (UID: "754114a2-a012-43fe-923b-a8cc3df91aa0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.679440 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "754114a2-a012-43fe-923b-a8cc3df91aa0" (UID: "754114a2-a012-43fe-923b-a8cc3df91aa0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.683448 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-config" (OuterVolumeSpecName: "config") pod "754114a2-a012-43fe-923b-a8cc3df91aa0" (UID: "754114a2-a012-43fe-923b-a8cc3df91aa0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.683913 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/754114a2-a012-43fe-923b-a8cc3df91aa0-config-out" (OuterVolumeSpecName: "config-out") pod "754114a2-a012-43fe-923b-a8cc3df91aa0" (UID: "754114a2-a012-43fe-923b-a8cc3df91aa0"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.685011 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "754114a2-a012-43fe-923b-a8cc3df91aa0" (UID: "754114a2-a012-43fe-923b-a8cc3df91aa0"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.685118 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/754114a2-a012-43fe-923b-a8cc3df91aa0-kube-api-access-nq5th" (OuterVolumeSpecName: "kube-api-access-nq5th") pod "754114a2-a012-43fe-923b-a8cc3df91aa0" (UID: "754114a2-a012-43fe-923b-a8cc3df91aa0"). InnerVolumeSpecName "kube-api-access-nq5th". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.693437 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/754114a2-a012-43fe-923b-a8cc3df91aa0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "754114a2-a012-43fe-923b-a8cc3df91aa0" (UID: "754114a2-a012-43fe-923b-a8cc3df91aa0"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.721770 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "754114a2-a012-43fe-923b-a8cc3df91aa0" (UID: "754114a2-a012-43fe-923b-a8cc3df91aa0"). InnerVolumeSpecName "pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.729321 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-web-config" (OuterVolumeSpecName: "web-config") pod "754114a2-a012-43fe-923b-a8cc3df91aa0" (UID: "754114a2-a012-43fe-923b-a8cc3df91aa0"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.779272 4721 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.779304 4721 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-web-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.779337 4721 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\") on node \"crc\" " Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.779349 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq5th\" (UniqueName: \"kubernetes.io/projected/754114a2-a012-43fe-923b-a8cc3df91aa0-kube-api-access-nq5th\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.779359 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.779369 4721 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/754114a2-a012-43fe-923b-a8cc3df91aa0-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.779378 4721 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/754114a2-a012-43fe-923b-a8cc3df91aa0-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.779388 4721 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/754114a2-a012-43fe-923b-a8cc3df91aa0-config-out\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.779398 4721 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.779407 4721 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/754114a2-a012-43fe-923b-a8cc3df91aa0-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.814749 4721 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.814944 4721 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3") on node "crc" Feb 02 13:22:53 crc kubenswrapper[4721]: I0202 13:22:53.881788 4721 reconciler_common.go:293] "Volume detached for volume \"pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.231134 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"754114a2-a012-43fe-923b-a8cc3df91aa0","Type":"ContainerDied","Data":"1ae72f1d8842f29156e47082662e67e943468c1763e7bc830d0036db2470a455"} Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.231229 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.232366 4721 scope.go:117] "RemoveContainer" containerID="048ea46306d4b172a5e08114cafc091bb471ab145b3144b948269acd23158658" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.232835 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"7a6930c7-1819-4b7d-baf6-773a8b68e568","Type":"ContainerStarted","Data":"29066421e6cce726a66f30e6952c937493f0f81dbe0ff9779f6c880b60322c1e"} Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.239661 4721 generic.go:334] "Generic (PLEG): container finished" podID="ffb2d2fc-d882-4343-a182-5d4cae12692f" containerID="80fd8d4c523d759364a503f4e0957e812daf0265c52b4e63f92c47e96ac7e275" exitCode=0 Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.239779 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l5h78-config-5bmh9" event={"ID":"ffb2d2fc-d882-4343-a182-5d4cae12692f","Type":"ContainerDied","Data":"80fd8d4c523d759364a503f4e0957e812daf0265c52b4e63f92c47e96ac7e275"} Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.239815 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l5h78-config-5bmh9" event={"ID":"ffb2d2fc-d882-4343-a182-5d4cae12692f","Type":"ContainerStarted","Data":"834db9ab0bc9ea710cdeeae1fdd49ed39cc3b00bb16ebc533c33cc39366d14fc"} Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.242128 4721 generic.go:334] "Generic (PLEG): container finished" podID="3d5ab80d-40d3-4259-a13a-efb59d66b725" containerID="a4049e92c383c0eb65178e6eb81956222b0a85112475c87886800360320c1322" exitCode=0 Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.242203 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nftsl" event={"ID":"3d5ab80d-40d3-4259-a13a-efb59d66b725","Type":"ContainerDied","Data":"a4049e92c383c0eb65178e6eb81956222b0a85112475c87886800360320c1322"} Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.242226 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nftsl" event={"ID":"3d5ab80d-40d3-4259-a13a-efb59d66b725","Type":"ContainerStarted","Data":"28498d7488cd8662de9104fba2a52e44054093c2b223f3e35ab776768caaf308"} Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.259654 4721 scope.go:117] "RemoveContainer" containerID="cac1e1e212591677852178e9111f4a55b1b7ae8d7ed3f2354152657d155f18e1" 
Feb 02 13:22:54 crc kubenswrapper[4721]: E0202 13:22:54.259888 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-hrqtc" podUID="0531b398-2d44-42c2-bd6c-9e9f7ab8c85d" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.304724 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.320238 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.323009 4721 scope.go:117] "RemoveContainer" containerID="ee494f7d068c5f80477a3c22c3a31526d01c4a02a690967dc4ac1b911a158a98" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.350371 4721 scope.go:117] "RemoveContainer" containerID="f51717e20215c350c32f4c0b374e223fc4a90aedd712f943c01301452ac10dc6" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.352285 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 13:22:54 crc kubenswrapper[4721]: E0202 13:22:54.360685 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="init-config-reloader" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.360752 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="init-config-reloader" Feb 02 13:22:54 crc kubenswrapper[4721]: E0202 13:22:54.360849 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="config-reloader" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.360859 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="config-reloader" Feb 02 13:22:54 crc kubenswrapper[4721]: E0202 13:22:54.360874 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="thanos-sidecar" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.360970 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="thanos-sidecar" Feb 02 13:22:54 crc kubenswrapper[4721]: E0202 13:22:54.361020 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="prometheus" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.361029 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="prometheus" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.361500 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="prometheus" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.361517 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="thanos-sidecar" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.361534 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" containerName="config-reloader" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.365673 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.367953 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-xt8ls" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.373403 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.373406 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.373744 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.374318 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.374414 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.375781 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.383406 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.385116 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.388243 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.427824 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="754114a2-a012-43fe-923b-a8cc3df91aa0" path="/var/lib/kubelet/pods/754114a2-a012-43fe-923b-a8cc3df91aa0/volumes" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.500044 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.500182 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-config\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.500232 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/6a34d077-087f-4b04-98c5-22e09450dcb3-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.500318 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.500356 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/6a34d077-087f-4b04-98c5-22e09450dcb3-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.500403 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.500430 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prj5q\" (UniqueName: \"kubernetes.io/projected/6a34d077-087f-4b04-98c5-22e09450dcb3-kube-api-access-prj5q\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.500522 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6a34d077-087f-4b04-98c5-22e09450dcb3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.500548 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.500613 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6a34d077-087f-4b04-98c5-22e09450dcb3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.500663 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.500682 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/6a34d077-087f-4b04-98c5-22e09450dcb3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.500772 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.500805 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.511337 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eabe6b07-da9d-4980-99b4-12c02640c88d-etc-swift\") pod \"swift-storage-0\" (UID: \"eabe6b07-da9d-4980-99b4-12c02640c88d\") " pod="openstack/swift-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.515061 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.602956 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.604407 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6a34d077-087f-4b04-98c5-22e09450dcb3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.604622 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.604750 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.604943 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-config\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 
13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.605172 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/6a34d077-087f-4b04-98c5-22e09450dcb3-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.605342 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.605444 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/6a34d077-087f-4b04-98c5-22e09450dcb3-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.605561 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.605662 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prj5q\" (UniqueName: \"kubernetes.io/projected/6a34d077-087f-4b04-98c5-22e09450dcb3-kube-api-access-prj5q\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.606646 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6a34d077-087f-4b04-98c5-22e09450dcb3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.606751 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.606995 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6a34d077-087f-4b04-98c5-22e09450dcb3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.609819 4721 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/6a34d077-087f-4b04-98c5-22e09450dcb3-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.624525 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.625098 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-config\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.625395 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.625421 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b8f571356fe2cb3489fcae1580a11d6ada33fdec8c8d1e0850e45e91197c9652/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.626411 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/6a34d077-087f-4b04-98c5-22e09450dcb3-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.626625 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6a34d077-087f-4b04-98c5-22e09450dcb3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.626810 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6a34d077-087f-4b04-98c5-22e09450dcb3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.626998 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0" Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 
Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.630560 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.631946 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.632041 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6a34d077-087f-4b04-98c5-22e09450dcb3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.632095 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prj5q\" (UniqueName: \"kubernetes.io/projected/6a34d077-087f-4b04-98c5-22e09450dcb3-kube-api-access-prj5q\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.635325 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6a34d077-087f-4b04-98c5-22e09450dcb3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.690435 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1c836ef-ba24-4c50-a59b-e33260ae1ac3\") pod \"prometheus-metric-storage-0\" (UID: \"6a34d077-087f-4b04-98c5-22e09450dcb3\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 13:22:54 crc kubenswrapper[4721]: I0202 13:22:54.704690 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 02 13:22:55 crc kubenswrapper[4721]: I0202 13:22:55.827877 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 02 13:22:55 crc kubenswrapper[4721]: I0202 13:22:55.889028 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-nftsl"
Need to start a new one" pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:55 crc kubenswrapper[4721]: I0202 13:22:55.944236 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpl4j\" (UniqueName: \"kubernetes.io/projected/3d5ab80d-40d3-4259-a13a-efb59d66b725-kube-api-access-fpl4j\") pod \"3d5ab80d-40d3-4259-a13a-efb59d66b725\" (UID: \"3d5ab80d-40d3-4259-a13a-efb59d66b725\") " Feb 02 13:22:55 crc kubenswrapper[4721]: I0202 13:22:55.944451 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d5ab80d-40d3-4259-a13a-efb59d66b725-operator-scripts\") pod \"3d5ab80d-40d3-4259-a13a-efb59d66b725\" (UID: \"3d5ab80d-40d3-4259-a13a-efb59d66b725\") " Feb 02 13:22:55 crc kubenswrapper[4721]: I0202 13:22:55.945564 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d5ab80d-40d3-4259-a13a-efb59d66b725-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d5ab80d-40d3-4259-a13a-efb59d66b725" (UID: "3d5ab80d-40d3-4259-a13a-efb59d66b725"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:55 crc kubenswrapper[4721]: I0202 13:22:55.946803 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d5ab80d-40d3-4259-a13a-efb59d66b725-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:55 crc kubenswrapper[4721]: I0202 13:22:55.959743 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5ab80d-40d3-4259-a13a-efb59d66b725-kube-api-access-fpl4j" (OuterVolumeSpecName: "kube-api-access-fpl4j") pod "3d5ab80d-40d3-4259-a13a-efb59d66b725" (UID: "3d5ab80d-40d3-4259-a13a-efb59d66b725"). InnerVolumeSpecName "kube-api-access-fpl4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:55 crc kubenswrapper[4721]: I0202 13:22:55.987678 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 13:22:55 crc kubenswrapper[4721]: W0202 13:22:55.989018 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a34d077_087f_4b04_98c5_22e09450dcb3.slice/crio-210ff9e6575bb6d9c6f06a24d54a63c727752ea3d455cc8dcc85c4b4bb3b8827 WatchSource:0}: Error finding container 210ff9e6575bb6d9c6f06a24d54a63c727752ea3d455cc8dcc85c4b4bb3b8827: Status 404 returned error can't find the container with id 210ff9e6575bb6d9c6f06a24d54a63c727752ea3d455cc8dcc85c4b4bb3b8827 Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.033982 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-l5h78" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.048478 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffb2d2fc-d882-4343-a182-5d4cae12692f-scripts\") pod \"ffb2d2fc-d882-4343-a182-5d4cae12692f\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.048582 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-run-ovn\") pod \"ffb2d2fc-d882-4343-a182-5d4cae12692f\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.048794 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8xf8\" (UniqueName: \"kubernetes.io/projected/ffb2d2fc-d882-4343-a182-5d4cae12692f-kube-api-access-t8xf8\") pod \"ffb2d2fc-d882-4343-a182-5d4cae12692f\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.048836 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ffb2d2fc-d882-4343-a182-5d4cae12692f-additional-scripts\") pod \"ffb2d2fc-d882-4343-a182-5d4cae12692f\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.048863 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-log-ovn\") pod \"ffb2d2fc-d882-4343-a182-5d4cae12692f\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.048891 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-run\") pod \"ffb2d2fc-d882-4343-a182-5d4cae12692f\" (UID: \"ffb2d2fc-d882-4343-a182-5d4cae12692f\") " Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.049619 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpl4j\" (UniqueName: \"kubernetes.io/projected/3d5ab80d-40d3-4259-a13a-efb59d66b725-kube-api-access-fpl4j\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.049664 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-run" (OuterVolumeSpecName: 
"var-run") pod "ffb2d2fc-d882-4343-a182-5d4cae12692f" (UID: "ffb2d2fc-d882-4343-a182-5d4cae12692f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.049929 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ffb2d2fc-d882-4343-a182-5d4cae12692f" (UID: "ffb2d2fc-d882-4343-a182-5d4cae12692f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.049970 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ffb2d2fc-d882-4343-a182-5d4cae12692f" (UID: "ffb2d2fc-d882-4343-a182-5d4cae12692f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.050051 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffb2d2fc-d882-4343-a182-5d4cae12692f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ffb2d2fc-d882-4343-a182-5d4cae12692f" (UID: "ffb2d2fc-d882-4343-a182-5d4cae12692f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.050596 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffb2d2fc-d882-4343-a182-5d4cae12692f-scripts" (OuterVolumeSpecName: "scripts") pod "ffb2d2fc-d882-4343-a182-5d4cae12692f" (UID: "ffb2d2fc-d882-4343-a182-5d4cae12692f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.063478 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffb2d2fc-d882-4343-a182-5d4cae12692f-kube-api-access-t8xf8" (OuterVolumeSpecName: "kube-api-access-t8xf8") pod "ffb2d2fc-d882-4343-a182-5d4cae12692f" (UID: "ffb2d2fc-d882-4343-a182-5d4cae12692f"). InnerVolumeSpecName "kube-api-access-t8xf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.153871 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffb2d2fc-d882-4343-a182-5d4cae12692f-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.153914 4721 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.153927 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8xf8\" (UniqueName: \"kubernetes.io/projected/ffb2d2fc-d882-4343-a182-5d4cae12692f-kube-api-access-t8xf8\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.153941 4721 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ffb2d2fc-d882-4343-a182-5d4cae12692f-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.153954 4721 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.153964 4721 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ffb2d2fc-d882-4343-a182-5d4cae12692f-var-run\") on node \"crc\" DevicePath \"\"" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.177493 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.180456 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.180886 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.295566 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"2b37f2dfa80fc68b4e9a756361e61309fa1a16513ea3496c3920cb4c670eb9cf"} Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.310127 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-nftsl" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.310726 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-nftsl" event={"ID":"3d5ab80d-40d3-4259-a13a-efb59d66b725","Type":"ContainerDied","Data":"28498d7488cd8662de9104fba2a52e44054093c2b223f3e35ab776768caaf308"} Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.310793 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28498d7488cd8662de9104fba2a52e44054093c2b223f3e35ab776768caaf308" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.315929 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"7a6930c7-1819-4b7d-baf6-773a8b68e568","Type":"ContainerStarted","Data":"425d57884fbd57da1d35ea45143e44ac869dd71714bc4250dbb690863f08fa73"} Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.325349 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6a34d077-087f-4b04-98c5-22e09450dcb3","Type":"ContainerStarted","Data":"210ff9e6575bb6d9c6f06a24d54a63c727752ea3d455cc8dcc85c4b4bb3b8827"} Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.334765 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l5h78-config-5bmh9" event={"ID":"ffb2d2fc-d882-4343-a182-5d4cae12692f","Type":"ContainerDied","Data":"834db9ab0bc9ea710cdeeae1fdd49ed39cc3b00bb16ebc533c33cc39366d14fc"} Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.334808 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="834db9ab0bc9ea710cdeeae1fdd49ed39cc3b00bb16ebc533c33cc39366d14fc" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.334816 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-l5h78-config-5bmh9" Feb 02 13:22:56 crc kubenswrapper[4721]: I0202 13:22:56.360334 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=8.777589048 podStartE2EDuration="10.360310169s" podCreationTimestamp="2026-02-02 13:22:46 +0000 UTC" firstStartedPulling="2026-02-02 13:22:53.615188372 +0000 UTC m=+1313.917702761" lastFinishedPulling="2026-02-02 13:22:55.197909493 +0000 UTC m=+1315.500423882" observedRunningTime="2026-02-02 13:22:56.3558965 +0000 UTC m=+1316.658410889" watchObservedRunningTime="2026-02-02 13:22:56.360310169 +0000 UTC m=+1316.662824558" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.055046 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-l5h78-config-5bmh9"] Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.063773 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-l5h78-config-5bmh9"] Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.185746 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-l5h78-config-ffx5v"] Feb 02 13:22:57 crc kubenswrapper[4721]: E0202 13:22:57.186313 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5ab80d-40d3-4259-a13a-efb59d66b725" containerName="mariadb-account-create-update" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.186339 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5ab80d-40d3-4259-a13a-efb59d66b725" containerName="mariadb-account-create-update" Feb 02 13:22:57 crc kubenswrapper[4721]: E0202 13:22:57.186417 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffb2d2fc-d882-4343-a182-5d4cae12692f" containerName="ovn-config" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.186429 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffb2d2fc-d882-4343-a182-5d4cae12692f" containerName="ovn-config" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.186683 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5ab80d-40d3-4259-a13a-efb59d66b725" containerName="mariadb-account-create-update" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.186723 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffb2d2fc-d882-4343-a182-5d4cae12692f" containerName="ovn-config" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.187648 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.193570 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.195035 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxj28\" (UniqueName: \"kubernetes.io/projected/aad06c3d-8212-4a61-b491-2af939014fd6-kube-api-access-pxj28\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.195215 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-run\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.195258 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-run-ovn\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.195412 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/aad06c3d-8212-4a61-b491-2af939014fd6-additional-scripts\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.195463 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-log-ovn\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.195495 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aad06c3d-8212-4a61-b491-2af939014fd6-scripts\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.212699 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l5h78-config-ffx5v"] Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.297177 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/aad06c3d-8212-4a61-b491-2af939014fd6-additional-scripts\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.297253 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-log-ovn\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.297277 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aad06c3d-8212-4a61-b491-2af939014fd6-scripts\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.297360 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxj28\" (UniqueName: \"kubernetes.io/projected/aad06c3d-8212-4a61-b491-2af939014fd6-kube-api-access-pxj28\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.297394 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-run\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.297416 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-run-ovn\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.297740 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-run-ovn\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.297738 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-log-ovn\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.297795 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-run\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.298387 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/aad06c3d-8212-4a61-b491-2af939014fd6-additional-scripts\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.301329 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/aad06c3d-8212-4a61-b491-2af939014fd6-scripts\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.329424 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxj28\" (UniqueName: \"kubernetes.io/projected/aad06c3d-8212-4a61-b491-2af939014fd6-kube-api-access-pxj28\") pod \"ovn-controller-l5h78-config-ffx5v\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") " pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.522147 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.805762 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-nftsl"] Feb 02 13:22:57 crc kubenswrapper[4721]: I0202 13:22:57.816171 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-nftsl"] Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.131304 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l5h78-config-ffx5v"] Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.214858 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-2whnq"] Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.216262 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2whnq" Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.233747 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2whnq"] Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.402608 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"3db13f1bd0d6b258fdb59622f982629010d6c369785373bbc3462937f58c58ae"} Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.404001 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l5h78-config-ffx5v" event={"ID":"aad06c3d-8212-4a61-b491-2af939014fd6","Type":"ContainerStarted","Data":"4ada9cfeb6a025a91e338410d7105720ab2f63a7cd8ce8f9992f12746a0d195c"} Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.417370 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwcg5\" (UniqueName: \"kubernetes.io/projected/937d142a-7868-4de2-85f3-90dcc5a74019-kube-api-access-dwcg5\") pod \"barbican-db-create-2whnq\" (UID: \"937d142a-7868-4de2-85f3-90dcc5a74019\") " pod="openstack/barbican-db-create-2whnq" Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.417586 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/937d142a-7868-4de2-85f3-90dcc5a74019-operator-scripts\") pod \"barbican-db-create-2whnq\" (UID: \"937d142a-7868-4de2-85f3-90dcc5a74019\") " pod="openstack/barbican-db-create-2whnq" Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.442663 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d5ab80d-40d3-4259-a13a-efb59d66b725" path="/var/lib/kubelet/pods/3d5ab80d-40d3-4259-a13a-efb59d66b725/volumes" Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.443287 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffb2d2fc-d882-4343-a182-5d4cae12692f" path="/var/lib/kubelet/pods/ffb2d2fc-d882-4343-a182-5d4cae12692f/volumes"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.509182 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-15d5-account-create-update-5kl6r"]
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.520634 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/937d142a-7868-4de2-85f3-90dcc5a74019-operator-scripts\") pod \"barbican-db-create-2whnq\" (UID: \"937d142a-7868-4de2-85f3-90dcc5a74019\") " pod="openstack/barbican-db-create-2whnq"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.520741 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwcg5\" (UniqueName: \"kubernetes.io/projected/937d142a-7868-4de2-85f3-90dcc5a74019-kube-api-access-dwcg5\") pod \"barbican-db-create-2whnq\" (UID: \"937d142a-7868-4de2-85f3-90dcc5a74019\") " pod="openstack/barbican-db-create-2whnq"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.522817 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/937d142a-7868-4de2-85f3-90dcc5a74019-operator-scripts\") pod \"barbican-db-create-2whnq\" (UID: \"937d142a-7868-4de2-85f3-90dcc5a74019\") " pod="openstack/barbican-db-create-2whnq"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.528803 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-15d5-account-create-update-5kl6r"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.537301 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.553431 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-15d5-account-create-update-5kl6r"]
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.587555 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwcg5\" (UniqueName: \"kubernetes.io/projected/937d142a-7868-4de2-85f3-90dcc5a74019-kube-api-access-dwcg5\") pod \"barbican-db-create-2whnq\" (UID: \"937d142a-7868-4de2-85f3-90dcc5a74019\") " pod="openstack/barbican-db-create-2whnq"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.623377 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67f56b66-72ae-4c95-8051-dc5f7a0faec4-operator-scripts\") pod \"barbican-15d5-account-create-update-5kl6r\" (UID: \"67f56b66-72ae-4c95-8051-dc5f7a0faec4\") " pod="openstack/barbican-15d5-account-create-update-5kl6r"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.623510 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9twnj\" (UniqueName: \"kubernetes.io/projected/67f56b66-72ae-4c95-8051-dc5f7a0faec4-kube-api-access-9twnj\") pod \"barbican-15d5-account-create-update-5kl6r\" (UID: \"67f56b66-72ae-4c95-8051-dc5f7a0faec4\") " pod="openstack/barbican-15d5-account-create-update-5kl6r"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.713059 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-xmp7t"]
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.714518 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-xmp7t"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.726576 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67f56b66-72ae-4c95-8051-dc5f7a0faec4-operator-scripts\") pod \"barbican-15d5-account-create-update-5kl6r\" (UID: \"67f56b66-72ae-4c95-8051-dc5f7a0faec4\") " pod="openstack/barbican-15d5-account-create-update-5kl6r"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.726716 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9twnj\" (UniqueName: \"kubernetes.io/projected/67f56b66-72ae-4c95-8051-dc5f7a0faec4-kube-api-access-9twnj\") pod \"barbican-15d5-account-create-update-5kl6r\" (UID: \"67f56b66-72ae-4c95-8051-dc5f7a0faec4\") " pod="openstack/barbican-15d5-account-create-update-5kl6r"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.727900 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67f56b66-72ae-4c95-8051-dc5f7a0faec4-operator-scripts\") pod \"barbican-15d5-account-create-update-5kl6r\" (UID: \"67f56b66-72ae-4c95-8051-dc5f7a0faec4\") " pod="openstack/barbican-15d5-account-create-update-5kl6r"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.745678 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-xmp7t"]
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.794931 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9twnj\" (UniqueName: \"kubernetes.io/projected/67f56b66-72ae-4c95-8051-dc5f7a0faec4-kube-api-access-9twnj\") pod \"barbican-15d5-account-create-update-5kl6r\" (UID: \"67f56b66-72ae-4c95-8051-dc5f7a0faec4\") " pod="openstack/barbican-15d5-account-create-update-5kl6r"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.833976 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51bc4821-8b8e-4972-a90e-67a7a7b1fee5-operator-scripts\") pod \"heat-db-create-xmp7t\" (UID: \"51bc4821-8b8e-4972-a90e-67a7a7b1fee5\") " pod="openstack/heat-db-create-xmp7t"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.834493 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrlbg\" (UniqueName: \"kubernetes.io/projected/51bc4821-8b8e-4972-a90e-67a7a7b1fee5-kube-api-access-mrlbg\") pod \"heat-db-create-xmp7t\" (UID: \"51bc4821-8b8e-4972-a90e-67a7a7b1fee5\") " pod="openstack/heat-db-create-xmp7t"
Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.866225 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2whnq"
Need to start a new one" pod="openstack/barbican-15d5-account-create-update-5kl6r" Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.946379 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrlbg\" (UniqueName: \"kubernetes.io/projected/51bc4821-8b8e-4972-a90e-67a7a7b1fee5-kube-api-access-mrlbg\") pod \"heat-db-create-xmp7t\" (UID: \"51bc4821-8b8e-4972-a90e-67a7a7b1fee5\") " pod="openstack/heat-db-create-xmp7t" Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.946697 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51bc4821-8b8e-4972-a90e-67a7a7b1fee5-operator-scripts\") pod \"heat-db-create-xmp7t\" (UID: \"51bc4821-8b8e-4972-a90e-67a7a7b1fee5\") " pod="openstack/heat-db-create-xmp7t" Feb 02 13:22:58 crc kubenswrapper[4721]: I0202 13:22:58.959765 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51bc4821-8b8e-4972-a90e-67a7a7b1fee5-operator-scripts\") pod \"heat-db-create-xmp7t\" (UID: \"51bc4821-8b8e-4972-a90e-67a7a7b1fee5\") " pod="openstack/heat-db-create-xmp7t" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.048152 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-6f5e-account-create-update-nk8v6"] Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.081357 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-wffvl"] Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.086597 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-6f5e-account-create-update-nk8v6" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.091837 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.092650 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wffvl"] Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.092764 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wffvl" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.129645 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-6f5e-account-create-update-nk8v6"] Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.173255 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrlbg\" (UniqueName: \"kubernetes.io/projected/51bc4821-8b8e-4972-a90e-67a7a7b1fee5-kube-api-access-mrlbg\") pod \"heat-db-create-xmp7t\" (UID: \"51bc4821-8b8e-4972-a90e-67a7a7b1fee5\") " pod="openstack/heat-db-create-xmp7t" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.176007 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-xmp7t" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.225981 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d74a2f0-9f60-4f59-92e4-11b9136f1db5-operator-scripts\") pod \"heat-6f5e-account-create-update-nk8v6\" (UID: \"0d74a2f0-9f60-4f59-92e4-11b9136f1db5\") " pod="openstack/heat-6f5e-account-create-update-nk8v6" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.227981 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bspjq\" (UniqueName: \"kubernetes.io/projected/13666544-a226-43ee-84c9-3232e9fff8d4-kube-api-access-bspjq\") pod \"cinder-db-create-wffvl\" (UID: \"13666544-a226-43ee-84c9-3232e9fff8d4\") " pod="openstack/cinder-db-create-wffvl" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.228880 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vckvd\" (UniqueName: \"kubernetes.io/projected/0d74a2f0-9f60-4f59-92e4-11b9136f1db5-kube-api-access-vckvd\") pod \"heat-6f5e-account-create-update-nk8v6\" (UID: \"0d74a2f0-9f60-4f59-92e4-11b9136f1db5\") " pod="openstack/heat-6f5e-account-create-update-nk8v6" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.230021 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13666544-a226-43ee-84c9-3232e9fff8d4-operator-scripts\") pod \"cinder-db-create-wffvl\" (UID: \"13666544-a226-43ee-84c9-3232e9fff8d4\") " pod="openstack/cinder-db-create-wffvl" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.276657 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-xs4g5"] Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.278563 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xs4g5" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.291756 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-219b-account-create-update-c48ml"] Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.293581 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-219b-account-create-update-c48ml" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.295571 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.310261 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xs4g5"] Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.326450 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-219b-account-create-update-c48ml"] Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.335899 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bspjq\" (UniqueName: \"kubernetes.io/projected/13666544-a226-43ee-84c9-3232e9fff8d4-kube-api-access-bspjq\") pod \"cinder-db-create-wffvl\" (UID: \"13666544-a226-43ee-84c9-3232e9fff8d4\") " pod="openstack/cinder-db-create-wffvl" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.335965 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vckvd\" (UniqueName: \"kubernetes.io/projected/0d74a2f0-9f60-4f59-92e4-11b9136f1db5-kube-api-access-vckvd\") pod \"heat-6f5e-account-create-update-nk8v6\" (UID: \"0d74a2f0-9f60-4f59-92e4-11b9136f1db5\") " pod="openstack/heat-6f5e-account-create-update-nk8v6" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.336012 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13666544-a226-43ee-84c9-3232e9fff8d4-operator-scripts\") pod \"cinder-db-create-wffvl\" (UID: \"13666544-a226-43ee-84c9-3232e9fff8d4\") " pod="openstack/cinder-db-create-wffvl" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.336063 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45ad4533-c6a5-49da-8f33-23113f8b7fea-operator-scripts\") pod \"neutron-db-create-xs4g5\" (UID: \"45ad4533-c6a5-49da-8f33-23113f8b7fea\") " pod="openstack/neutron-db-create-xs4g5" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.336100 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8m4n\" (UniqueName: \"kubernetes.io/projected/45ad4533-c6a5-49da-8f33-23113f8b7fea-kube-api-access-f8m4n\") pod \"neutron-db-create-xs4g5\" (UID: \"45ad4533-c6a5-49da-8f33-23113f8b7fea\") " pod="openstack/neutron-db-create-xs4g5" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.336133 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d74a2f0-9f60-4f59-92e4-11b9136f1db5-operator-scripts\") pod \"heat-6f5e-account-create-update-nk8v6\" (UID: \"0d74a2f0-9f60-4f59-92e4-11b9136f1db5\") " pod="openstack/heat-6f5e-account-create-update-nk8v6" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.336906 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d74a2f0-9f60-4f59-92e4-11b9136f1db5-operator-scripts\") pod \"heat-6f5e-account-create-update-nk8v6\" (UID: \"0d74a2f0-9f60-4f59-92e4-11b9136f1db5\") " pod="openstack/heat-6f5e-account-create-update-nk8v6" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.337838 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/13666544-a226-43ee-84c9-3232e9fff8d4-operator-scripts\") pod \"cinder-db-create-wffvl\" (UID: \"13666544-a226-43ee-84c9-3232e9fff8d4\") " pod="openstack/cinder-db-create-wffvl" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.342130 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-wvp2n"] Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.343871 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wvp2n" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.348792 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.350403 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.350763 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.351009 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7z9s7" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.358199 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wvp2n"] Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.378229 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vckvd\" (UniqueName: \"kubernetes.io/projected/0d74a2f0-9f60-4f59-92e4-11b9136f1db5-kube-api-access-vckvd\") pod \"heat-6f5e-account-create-update-nk8v6\" (UID: \"0d74a2f0-9f60-4f59-92e4-11b9136f1db5\") " pod="openstack/heat-6f5e-account-create-update-nk8v6" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.378750 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bspjq\" (UniqueName: \"kubernetes.io/projected/13666544-a226-43ee-84c9-3232e9fff8d4-kube-api-access-bspjq\") pod \"cinder-db-create-wffvl\" (UID: \"13666544-a226-43ee-84c9-3232e9fff8d4\") " pod="openstack/cinder-db-create-wffvl" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.418330 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9865-account-create-update-5xd7v"] Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.420267 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9865-account-create-update-5xd7v" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.422563 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.438456 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3395efa1-7b43-4b48-9e06-764b9428c5ab-combined-ca-bundle\") pod \"keystone-db-sync-wvp2n\" (UID: \"3395efa1-7b43-4b48-9e06-764b9428c5ab\") " pod="openstack/keystone-db-sync-wvp2n" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.438530 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppjqv\" (UniqueName: \"kubernetes.io/projected/3395efa1-7b43-4b48-9e06-764b9428c5ab-kube-api-access-ppjqv\") pod \"keystone-db-sync-wvp2n\" (UID: \"3395efa1-7b43-4b48-9e06-764b9428c5ab\") " pod="openstack/keystone-db-sync-wvp2n" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.438603 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46b707f0-c9cf-46b5-b615-4c0ab1da0391-operator-scripts\") pod \"cinder-219b-account-create-update-c48ml\" (UID: \"46b707f0-c9cf-46b5-b615-4c0ab1da0391\") " pod="openstack/cinder-219b-account-create-update-c48ml" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.438662 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xshcx\" (UniqueName: \"kubernetes.io/projected/46b707f0-c9cf-46b5-b615-4c0ab1da0391-kube-api-access-xshcx\") pod \"cinder-219b-account-create-update-c48ml\" (UID: \"46b707f0-c9cf-46b5-b615-4c0ab1da0391\") " pod="openstack/cinder-219b-account-create-update-c48ml" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.438737 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45ad4533-c6a5-49da-8f33-23113f8b7fea-operator-scripts\") pod \"neutron-db-create-xs4g5\" (UID: \"45ad4533-c6a5-49da-8f33-23113f8b7fea\") " pod="openstack/neutron-db-create-xs4g5" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.438763 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8m4n\" (UniqueName: \"kubernetes.io/projected/45ad4533-c6a5-49da-8f33-23113f8b7fea-kube-api-access-f8m4n\") pod \"neutron-db-create-xs4g5\" (UID: \"45ad4533-c6a5-49da-8f33-23113f8b7fea\") " pod="openstack/neutron-db-create-xs4g5" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.438785 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3395efa1-7b43-4b48-9e06-764b9428c5ab-config-data\") pod \"keystone-db-sync-wvp2n\" (UID: \"3395efa1-7b43-4b48-9e06-764b9428c5ab\") " pod="openstack/keystone-db-sync-wvp2n" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.439506 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45ad4533-c6a5-49da-8f33-23113f8b7fea-operator-scripts\") pod \"neutron-db-create-xs4g5\" (UID: \"45ad4533-c6a5-49da-8f33-23113f8b7fea\") " pod="openstack/neutron-db-create-xs4g5" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.440473 4721 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9865-account-create-update-5xd7v"] Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.449222 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l5h78-config-ffx5v" event={"ID":"aad06c3d-8212-4a61-b491-2af939014fd6","Type":"ContainerStarted","Data":"53b5d624776f4223952c5574dd921abb5bb1d5c538eeb2e730f20d491cd8ec2a"} Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.451371 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-6f5e-account-create-update-nk8v6" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.468272 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wffvl" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.504963 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-l5h78-config-ffx5v" podStartSLOduration=2.504941958 podStartE2EDuration="2.504941958s" podCreationTimestamp="2026-02-02 13:22:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:22:59.491667769 +0000 UTC m=+1319.794182158" watchObservedRunningTime="2026-02-02 13:22:59.504941958 +0000 UTC m=+1319.807456347" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.524672 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8m4n\" (UniqueName: \"kubernetes.io/projected/45ad4533-c6a5-49da-8f33-23113f8b7fea-kube-api-access-f8m4n\") pod \"neutron-db-create-xs4g5\" (UID: \"45ad4533-c6a5-49da-8f33-23113f8b7fea\") " pod="openstack/neutron-db-create-xs4g5" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.540498 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xshcx\" (UniqueName: \"kubernetes.io/projected/46b707f0-c9cf-46b5-b615-4c0ab1da0391-kube-api-access-xshcx\") pod \"cinder-219b-account-create-update-c48ml\" (UID: \"46b707f0-c9cf-46b5-b615-4c0ab1da0391\") " pod="openstack/cinder-219b-account-create-update-c48ml" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.540649 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3395efa1-7b43-4b48-9e06-764b9428c5ab-config-data\") pod \"keystone-db-sync-wvp2n\" (UID: \"3395efa1-7b43-4b48-9e06-764b9428c5ab\") " pod="openstack/keystone-db-sync-wvp2n" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.540747 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3395efa1-7b43-4b48-9e06-764b9428c5ab-combined-ca-bundle\") pod \"keystone-db-sync-wvp2n\" (UID: \"3395efa1-7b43-4b48-9e06-764b9428c5ab\") " pod="openstack/keystone-db-sync-wvp2n" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.540780 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92mfj\" (UniqueName: \"kubernetes.io/projected/375b0aad-b921-41d8-af30-181ac4a73c0b-kube-api-access-92mfj\") pod \"neutron-9865-account-create-update-5xd7v\" (UID: \"375b0aad-b921-41d8-af30-181ac4a73c0b\") " pod="openstack/neutron-9865-account-create-update-5xd7v" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.540851 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppjqv\" 
(UniqueName: \"kubernetes.io/projected/3395efa1-7b43-4b48-9e06-764b9428c5ab-kube-api-access-ppjqv\") pod \"keystone-db-sync-wvp2n\" (UID: \"3395efa1-7b43-4b48-9e06-764b9428c5ab\") " pod="openstack/keystone-db-sync-wvp2n" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.540884 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/375b0aad-b921-41d8-af30-181ac4a73c0b-operator-scripts\") pod \"neutron-9865-account-create-update-5xd7v\" (UID: \"375b0aad-b921-41d8-af30-181ac4a73c0b\") " pod="openstack/neutron-9865-account-create-update-5xd7v" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.540966 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46b707f0-c9cf-46b5-b615-4c0ab1da0391-operator-scripts\") pod \"cinder-219b-account-create-update-c48ml\" (UID: \"46b707f0-c9cf-46b5-b615-4c0ab1da0391\") " pod="openstack/cinder-219b-account-create-update-c48ml" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.544373 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3395efa1-7b43-4b48-9e06-764b9428c5ab-config-data\") pod \"keystone-db-sync-wvp2n\" (UID: \"3395efa1-7b43-4b48-9e06-764b9428c5ab\") " pod="openstack/keystone-db-sync-wvp2n" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.544552 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46b707f0-c9cf-46b5-b615-4c0ab1da0391-operator-scripts\") pod \"cinder-219b-account-create-update-c48ml\" (UID: \"46b707f0-c9cf-46b5-b615-4c0ab1da0391\") " pod="openstack/cinder-219b-account-create-update-c48ml" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.544872 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3395efa1-7b43-4b48-9e06-764b9428c5ab-combined-ca-bundle\") pod \"keystone-db-sync-wvp2n\" (UID: \"3395efa1-7b43-4b48-9e06-764b9428c5ab\") " pod="openstack/keystone-db-sync-wvp2n" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.564588 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppjqv\" (UniqueName: \"kubernetes.io/projected/3395efa1-7b43-4b48-9e06-764b9428c5ab-kube-api-access-ppjqv\") pod \"keystone-db-sync-wvp2n\" (UID: \"3395efa1-7b43-4b48-9e06-764b9428c5ab\") " pod="openstack/keystone-db-sync-wvp2n" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.574577 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xshcx\" (UniqueName: \"kubernetes.io/projected/46b707f0-c9cf-46b5-b615-4c0ab1da0391-kube-api-access-xshcx\") pod \"cinder-219b-account-create-update-c48ml\" (UID: \"46b707f0-c9cf-46b5-b615-4c0ab1da0391\") " pod="openstack/cinder-219b-account-create-update-c48ml" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.618545 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xs4g5" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.642553 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92mfj\" (UniqueName: \"kubernetes.io/projected/375b0aad-b921-41d8-af30-181ac4a73c0b-kube-api-access-92mfj\") pod \"neutron-9865-account-create-update-5xd7v\" (UID: \"375b0aad-b921-41d8-af30-181ac4a73c0b\") " pod="openstack/neutron-9865-account-create-update-5xd7v" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.642622 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/375b0aad-b921-41d8-af30-181ac4a73c0b-operator-scripts\") pod \"neutron-9865-account-create-update-5xd7v\" (UID: \"375b0aad-b921-41d8-af30-181ac4a73c0b\") " pod="openstack/neutron-9865-account-create-update-5xd7v" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.643480 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/375b0aad-b921-41d8-af30-181ac4a73c0b-operator-scripts\") pod \"neutron-9865-account-create-update-5xd7v\" (UID: \"375b0aad-b921-41d8-af30-181ac4a73c0b\") " pod="openstack/neutron-9865-account-create-update-5xd7v" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.664682 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92mfj\" (UniqueName: \"kubernetes.io/projected/375b0aad-b921-41d8-af30-181ac4a73c0b-kube-api-access-92mfj\") pod \"neutron-9865-account-create-update-5xd7v\" (UID: \"375b0aad-b921-41d8-af30-181ac4a73c0b\") " pod="openstack/neutron-9865-account-create-update-5xd7v" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.758545 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2whnq"] Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.761500 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-219b-account-create-update-c48ml" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.785897 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wvp2n" Feb 02 13:22:59 crc kubenswrapper[4721]: I0202 13:22:59.959014 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9865-account-create-update-5xd7v"
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.117712 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-15d5-account-create-update-5kl6r"]
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.139433 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-xmp7t"]
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.543694 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"433b60737b3066082bb639518e6147a5a6503ebf9a4c06d09a6c97730c7824ba"}
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.544920 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"9afe80fe939a2ef5cddb97ad2017b2646fef9c58e3864240547b1fd3d8877107"}
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.544990 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"293cf162bd164320a8b9271bc53f14416d35b91812c9de3fdda210a674d4fe6c"}
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.599269 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6a34d077-087f-4b04-98c5-22e09450dcb3","Type":"ContainerStarted","Data":"d77970247eaad2de283a9d77316bf0b2e70ed8392f8f537a13730da5d5109f30"}
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.622529 4721 generic.go:334] "Generic (PLEG): container finished" podID="aad06c3d-8212-4a61-b491-2af939014fd6" containerID="53b5d624776f4223952c5574dd921abb5bb1d5c538eeb2e730f20d491cd8ec2a" exitCode=0
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.622603 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l5h78-config-ffx5v" event={"ID":"aad06c3d-8212-4a61-b491-2af939014fd6","Type":"ContainerDied","Data":"53b5d624776f4223952c5574dd921abb5bb1d5c538eeb2e730f20d491cd8ec2a"}
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.658414 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2whnq" event={"ID":"937d142a-7868-4de2-85f3-90dcc5a74019","Type":"ContainerStarted","Data":"49e2f9d6a9f5b04c1ec533b19afe36a66018a912e5ef184f9b92ab178816de33"}
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.658447 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2whnq" event={"ID":"937d142a-7868-4de2-85f3-90dcc5a74019","Type":"ContainerStarted","Data":"ccbde8ddec5b3224b7082f9998adb72d6d3847a838f98d65a4e3c8f153647757"}
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.659975 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-6f5e-account-create-update-nk8v6"]
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.663946 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.671315 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-15d5-account-create-update-5kl6r" event={"ID":"67f56b66-72ae-4c95-8051-dc5f7a0faec4","Type":"ContainerStarted","Data":"27b5beb216b40a6bf47c26d5501508f64c30a9831dd1df313dcf922bbdd6bfbf"}
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.671383 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-15d5-account-create-update-5kl6r" event={"ID":"67f56b66-72ae-4c95-8051-dc5f7a0faec4","Type":"ContainerStarted","Data":"44f1c86ebd459ef3f37a415120f2dce3ea862497ab8630574d3be6ad60426784"}
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.679829 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wffvl"]
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.688379 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-xmp7t" event={"ID":"51bc4821-8b8e-4972-a90e-67a7a7b1fee5","Type":"ContainerStarted","Data":"5b1beefa4df5e3624727dd6828dfc7ef605a0f5abd94ccfd815abf8f0ff6c96a"}
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.706818 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-xs4g5"]
Feb 02 13:23:00 crc kubenswrapper[4721]: W0202 13:23:00.735597 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46b707f0_c9cf_46b5_b615_4c0ab1da0391.slice/crio-f41f8d7a37c5122fc63cc4eb313e0c73b2f929235be1be06f31ae30167258bf6 WatchSource:0}: Error finding container f41f8d7a37c5122fc63cc4eb313e0c73b2f929235be1be06f31ae30167258bf6: Status 404 returned error can't find the container with id f41f8d7a37c5122fc63cc4eb313e0c73b2f929235be1be06f31ae30167258bf6
Feb 02 13:23:00 crc kubenswrapper[4721]: W0202 13:23:00.811131 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3395efa1_7b43_4b48_9e06_764b9428c5ab.slice/crio-9a4595f760619b3ba238fcbf4e59500ea2d92f85648541e4ad855bfe4ec9d3fa WatchSource:0}: Error finding container 9a4595f760619b3ba238fcbf4e59500ea2d92f85648541e4ad855bfe4ec9d3fa: Status 404 returned error can't find the container with id 9a4595f760619b3ba238fcbf4e59500ea2d92f85648541e4ad855bfe4ec9d3fa
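The two W-level manager.go:1169 warnings just above look like cAdvisor's cgroup watcher racing pod startup: the besteffort cgroup slice for a brand-new crio sandbox appears before the runtime can answer for the container ID, so the lookup returns 404. Both IDs (f41f8d7a... for cinder-219b-account-create-update-c48ml, 9a4595f7... for keystone-db-sync-wvp2n, per the cgroup paths) reappear moments later in PLEG ContainerStarted events, the usual sign that the warning was a transient race rather than a lost container. A minimal sketch that cross-checks this; it assumes the klog line layout seen here, and the log path passed on the command line is a hypothetical local copy:

import re
import sys

# Container IDs cAdvisor could not find while handling a cgroup watch event
# (manager.go "Failed to process watch event ... Status 404").
WATCH_404 = re.compile(r"Error finding container ([0-9a-f]{64}):")
# Container/sandbox IDs later reported by PLEG as started.
PLEG_STARTED = re.compile(r'"Type":"ContainerStarted","Data":"([0-9a-f]{64})"')

missing, started = set(), set()
with open(sys.argv[1]) as log:  # e.g. a saved copy of this kubelet log
    for line in log:
        missing.update(WATCH_404.findall(line))
        started.update(PLEG_STARTED.findall(line))

for cid in sorted(missing):
    verdict = "seen in a later ContainerStarted event (benign race)" if cid in started else "never started"
    print(cid[:13], verdict)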
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.829905 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.867375 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.872647 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-219b-account-create-update-c48ml"]
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.884161 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-wvp2n"]
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.901382 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-15d5-account-create-update-5kl6r" podStartSLOduration=2.9013588820000002 podStartE2EDuration="2.901358882s" podCreationTimestamp="2026-02-02 13:22:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:00.768248313 +0000 UTC m=+1321.070762702" watchObservedRunningTime="2026-02-02 13:23:00.901358882 +0000 UTC m=+1321.203873261"
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.913997 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-2whnq" podStartSLOduration=2.913975943 podStartE2EDuration="2.913975943s" podCreationTimestamp="2026-02-02 13:22:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:00.796822785 +0000 UTC m=+1321.099337174" watchObservedRunningTime="2026-02-02 13:23:00.913975943 +0000 UTC m=+1321.216490342"
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.934890 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9865-account-create-update-5xd7v"]
Feb 02 13:23:00 crc kubenswrapper[4721]: I0202 13:23:00.937496 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-xmp7t" podStartSLOduration=2.9374794680000003 podStartE2EDuration="2.937479468s" podCreationTimestamp="2026-02-02 13:22:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:00.81143308 +0000 UTC m=+1321.113947469" watchObservedRunningTime="2026-02-02 13:23:00.937479468 +0000 UTC m=+1321.239993857"
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.232583 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ncb4h"]
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.235196 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ncb4h"
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.238468 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.256747 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ncb4h"]
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.316021 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v98wl\" (UniqueName: \"kubernetes.io/projected/33682180-e2b1-4c20-a374-1a90e1ccea48-kube-api-access-v98wl\") pod \"root-account-create-update-ncb4h\" (UID: \"33682180-e2b1-4c20-a374-1a90e1ccea48\") " pod="openstack/root-account-create-update-ncb4h"
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.316831 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33682180-e2b1-4c20-a374-1a90e1ccea48-operator-scripts\") pod \"root-account-create-update-ncb4h\" (UID: \"33682180-e2b1-4c20-a374-1a90e1ccea48\") " pod="openstack/root-account-create-update-ncb4h"
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.418790 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33682180-e2b1-4c20-a374-1a90e1ccea48-operator-scripts\") pod \"root-account-create-update-ncb4h\" (UID: \"33682180-e2b1-4c20-a374-1a90e1ccea48\") " pod="openstack/root-account-create-update-ncb4h"
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.418886 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v98wl\" (UniqueName: \"kubernetes.io/projected/33682180-e2b1-4c20-a374-1a90e1ccea48-kube-api-access-v98wl\") pod \"root-account-create-update-ncb4h\" (UID: \"33682180-e2b1-4c20-a374-1a90e1ccea48\") " pod="openstack/root-account-create-update-ncb4h"
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.419720 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33682180-e2b1-4c20-a374-1a90e1ccea48-operator-scripts\") pod \"root-account-create-update-ncb4h\" (UID: \"33682180-e2b1-4c20-a374-1a90e1ccea48\") " pod="openstack/root-account-create-update-ncb4h"
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.447780 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v98wl\" (UniqueName: \"kubernetes.io/projected/33682180-e2b1-4c20-a374-1a90e1ccea48-kube-api-access-v98wl\") pod \"root-account-create-update-ncb4h\" (UID: \"33682180-e2b1-4c20-a374-1a90e1ccea48\") " pod="openstack/root-account-create-update-ncb4h"
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.616895 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ncb4h"
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.736569 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9865-account-create-update-5xd7v" event={"ID":"375b0aad-b921-41d8-af30-181ac4a73c0b","Type":"ContainerStarted","Data":"b4ecf1fb05394c16a116b372fdffdfa0e7375e6cf9e5a5e825266b2a826c68fd"}
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.736805 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9865-account-create-update-5xd7v" event={"ID":"375b0aad-b921-41d8-af30-181ac4a73c0b","Type":"ContainerStarted","Data":"974942fc39aed915ac1ed8e72bf664ad24873fb228a582d3a6ebf0cd4fa2ef59"}
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.755849 4721 generic.go:334] "Generic (PLEG): container finished" podID="937d142a-7868-4de2-85f3-90dcc5a74019" containerID="49e2f9d6a9f5b04c1ec533b19afe36a66018a912e5ef184f9b92ab178816de33" exitCode=0
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.756042 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2whnq" event={"ID":"937d142a-7868-4de2-85f3-90dcc5a74019","Type":"ContainerDied","Data":"49e2f9d6a9f5b04c1ec533b19afe36a66018a912e5ef184f9b92ab178816de33"}
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.761080 4721 generic.go:334] "Generic (PLEG): container finished" podID="67f56b66-72ae-4c95-8051-dc5f7a0faec4" containerID="27b5beb216b40a6bf47c26d5501508f64c30a9831dd1df313dcf922bbdd6bfbf" exitCode=0
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.761152 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-15d5-account-create-update-5kl6r" event={"ID":"67f56b66-72ae-4c95-8051-dc5f7a0faec4","Type":"ContainerDied","Data":"27b5beb216b40a6bf47c26d5501508f64c30a9831dd1df313dcf922bbdd6bfbf"}
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.763626 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xs4g5" event={"ID":"45ad4533-c6a5-49da-8f33-23113f8b7fea","Type":"ContainerStarted","Data":"ebaaea23221e663dcf2fea49186c159f19607ea1b1253a2c8f769a54c895b470"}
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.763724 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xs4g5" event={"ID":"45ad4533-c6a5-49da-8f33-23113f8b7fea","Type":"ContainerStarted","Data":"29e7addf8e62cdc1322715c0687562f2bbdba871b07331508c1e9cd6c6eb1e35"}
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.765921 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-219b-account-create-update-c48ml" event={"ID":"46b707f0-c9cf-46b5-b615-4c0ab1da0391","Type":"ContainerStarted","Data":"fdd34df09b38c8d4122669f8f4cba7de1072b0822ab1da7e4f47ca0b7bdd4576"}
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.765989 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-219b-account-create-update-c48ml" event={"ID":"46b707f0-c9cf-46b5-b615-4c0ab1da0391","Type":"ContainerStarted","Data":"f41f8d7a37c5122fc63cc4eb313e0c73b2f929235be1be06f31ae30167258bf6"}
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.768054 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-9865-account-create-update-5xd7v" podStartSLOduration=2.768027253 podStartE2EDuration="2.768027253s" podCreationTimestamp="2026-02-02 13:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:01.75644509 +0000 UTC m=+1322.058959479" watchObservedRunningTime="2026-02-02 13:23:01.768027253 +0000 UTC m=+1322.070541652"
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.768357 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-6f5e-account-create-update-nk8v6" event={"ID":"0d74a2f0-9f60-4f59-92e4-11b9136f1db5","Type":"ContainerStarted","Data":"c60a5b8d8e8a5c3310b5b5e67ee44c42a8f4c0e6c7827776fd046304bab7b307"}
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.768392 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-6f5e-account-create-update-nk8v6" event={"ID":"0d74a2f0-9f60-4f59-92e4-11b9136f1db5","Type":"ContainerStarted","Data":"169c0d730bc1c83a8cb544a1cea72333ffa248d90f19384ccb9cd170b9f67be0"}
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.769778 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wvp2n" event={"ID":"3395efa1-7b43-4b48-9e06-764b9428c5ab","Type":"ContainerStarted","Data":"9a4595f760619b3ba238fcbf4e59500ea2d92f85648541e4ad855bfe4ec9d3fa"}
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.779636 4721 generic.go:334] "Generic (PLEG): container finished" podID="13666544-a226-43ee-84c9-3232e9fff8d4" containerID="11d4406ea2aeef5800b1d48d5c16350e8f64df4bb7e540c2b8bb59f164e7298a" exitCode=0
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.779829 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wffvl" event={"ID":"13666544-a226-43ee-84c9-3232e9fff8d4","Type":"ContainerDied","Data":"11d4406ea2aeef5800b1d48d5c16350e8f64df4bb7e540c2b8bb59f164e7298a"}
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.779896 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wffvl" event={"ID":"13666544-a226-43ee-84c9-3232e9fff8d4","Type":"ContainerStarted","Data":"0f7322d4b1604bf0b23f8dfd7fd198bcd3d01cc971e9e7e970f47421793f65ea"}
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.787660 4721 generic.go:334] "Generic (PLEG): container finished" podID="51bc4821-8b8e-4972-a90e-67a7a7b1fee5" containerID="075500f50301a059fcb64d3fd73cd025d86472e12f54ce995ede7cc3876a10cc" exitCode=0
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.787729 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-xmp7t" event={"ID":"51bc4821-8b8e-4972-a90e-67a7a7b1fee5","Type":"ContainerDied","Data":"075500f50301a059fcb64d3fd73cd025d86472e12f54ce995ede7cc3876a10cc"}
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.802766 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-6f5e-account-create-update-nk8v6" podStartSLOduration=3.802747621 podStartE2EDuration="3.802747621s" podCreationTimestamp="2026-02-02 13:22:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:01.795266239 +0000 UTC m=+1322.097780638" watchObservedRunningTime="2026-02-02 13:23:01.802747621 +0000 UTC m=+1322.105262010"
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.900335 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-xs4g5" podStartSLOduration=2.900250958 podStartE2EDuration="2.900250958s" podCreationTimestamp="2026-02-02 13:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:01.813817531 +0000 UTC m=+1322.116331920" watchObservedRunningTime="2026-02-02 13:23:01.900250958 +0000 UTC m=+1322.202765347"
Feb 02 13:23:01 crc kubenswrapper[4721]: I0202 13:23:01.913395 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-219b-account-create-update-c48ml" podStartSLOduration=2.913375462 podStartE2EDuration="2.913375462s" podCreationTimestamp="2026-02-02 13:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:01.848436487 +0000 UTC m=+1322.150950906" watchObservedRunningTime="2026-02-02 13:23:01.913375462 +0000 UTC m=+1322.215889851"
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.180694 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ncb4h"]
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.313901 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l5h78-config-ffx5v"
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.371220 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aad06c3d-8212-4a61-b491-2af939014fd6-scripts\") pod \"aad06c3d-8212-4a61-b491-2af939014fd6\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") "
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.373593 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad06c3d-8212-4a61-b491-2af939014fd6-scripts" (OuterVolumeSpecName: "scripts") pod "aad06c3d-8212-4a61-b491-2af939014fd6" (UID: "aad06c3d-8212-4a61-b491-2af939014fd6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.373742 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-run-ovn\") pod \"aad06c3d-8212-4a61-b491-2af939014fd6\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") "
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.373910 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/aad06c3d-8212-4a61-b491-2af939014fd6-additional-scripts\") pod \"aad06c3d-8212-4a61-b491-2af939014fd6\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") "
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.374059 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-log-ovn\") pod \"aad06c3d-8212-4a61-b491-2af939014fd6\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") "
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.374118 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxj28\" (UniqueName: \"kubernetes.io/projected/aad06c3d-8212-4a61-b491-2af939014fd6-kube-api-access-pxj28\") pod \"aad06c3d-8212-4a61-b491-2af939014fd6\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") "
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.374226 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-run\") pod \"aad06c3d-8212-4a61-b491-2af939014fd6\" (UID: \"aad06c3d-8212-4a61-b491-2af939014fd6\") "
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.374982 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aad06c3d-8212-4a61-b491-2af939014fd6-scripts\") on node \"crc\" DevicePath \"\""
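The ovn-controller-l5h78-config-ffx5v entries above are the mount path run in reverse for its scripts volume: UnmountVolume started (reconciler_common.go:159), UnmountVolume.TearDown succeeded (operation_generator.go:803), and finally "Volume detached" (reconciler_common.go:293) clearing the volume from the node's actual state; the remaining five volumes complete the same cycle below, followed by the SyncLoop DELETE and REMOVE that retire the pod. A sketch that pairs the started and detached messages to flag volumes that never finish tearing down; same hedged format assumptions as the sketches above:

import re
import sys

UNMOUNT = re.compile(r'operationExecutor\.UnmountVolume started for volume \\"([^"]+)\\" \(UniqueName: \\"([^"]+)\\"')
DETACHED = re.compile(r'Volume detached for volume \\"([^"]+)\\" \(UniqueName: \\"([^"]+)\\"')

pending = {}
with open(sys.argv[1]) as log:  # hypothetical saved copy of this kubelet log
    for lineno, line in enumerate(log, 1):
        m = UNMOUNT.search(line)
        if m:
            pending[m.group(2)] = (m.group(1), lineno)
            continue
        m = DETACHED.search(line)
        if m:
            pending.pop(m.group(2), None)

for unique_name, (volume, lineno) in sorted(pending.items()):
    print(f"volume {volume} ({unique_name}) started unmounting at line {lineno} but never detached")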
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.375023 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-run" (OuterVolumeSpecName: "var-run") pod "aad06c3d-8212-4a61-b491-2af939014fd6" (UID: "aad06c3d-8212-4a61-b491-2af939014fd6"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.375047 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "aad06c3d-8212-4a61-b491-2af939014fd6" (UID: "aad06c3d-8212-4a61-b491-2af939014fd6"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.376556 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad06c3d-8212-4a61-b491-2af939014fd6-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "aad06c3d-8212-4a61-b491-2af939014fd6" (UID: "aad06c3d-8212-4a61-b491-2af939014fd6"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.376592 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "aad06c3d-8212-4a61-b491-2af939014fd6" (UID: "aad06c3d-8212-4a61-b491-2af939014fd6"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.413475 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aad06c3d-8212-4a61-b491-2af939014fd6-kube-api-access-pxj28" (OuterVolumeSpecName: "kube-api-access-pxj28") pod "aad06c3d-8212-4a61-b491-2af939014fd6" (UID: "aad06c3d-8212-4a61-b491-2af939014fd6"). InnerVolumeSpecName "kube-api-access-pxj28". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.480121 4721 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.480176 4721 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/aad06c3d-8212-4a61-b491-2af939014fd6-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.480195 4721 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.480207 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxj28\" (UniqueName: \"kubernetes.io/projected/aad06c3d-8212-4a61-b491-2af939014fd6-kube-api-access-pxj28\") on node \"crc\" DevicePath \"\""
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.480221 4721 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/aad06c3d-8212-4a61-b491-2af939014fd6-var-run\") on node \"crc\" DevicePath \"\""
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.584675 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-l5h78-config-ffx5v"]
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.601892 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-l5h78-config-ffx5v"]
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.821784 4721 generic.go:334] "Generic (PLEG): container finished" podID="46b707f0-c9cf-46b5-b615-4c0ab1da0391" containerID="fdd34df09b38c8d4122669f8f4cba7de1072b0822ab1da7e4f47ca0b7bdd4576" exitCode=0
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.824914 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-219b-account-create-update-c48ml" event={"ID":"46b707f0-c9cf-46b5-b615-4c0ab1da0391","Type":"ContainerDied","Data":"fdd34df09b38c8d4122669f8f4cba7de1072b0822ab1da7e4f47ca0b7bdd4576"}
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.831110 4721 generic.go:334] "Generic (PLEG): container finished" podID="0d74a2f0-9f60-4f59-92e4-11b9136f1db5" containerID="c60a5b8d8e8a5c3310b5b5e67ee44c42a8f4c0e6c7827776fd046304bab7b307" exitCode=0
Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.831173 4721 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/heat-6f5e-account-create-update-nk8v6" event={"ID":"0d74a2f0-9f60-4f59-92e4-11b9136f1db5","Type":"ContainerDied","Data":"c60a5b8d8e8a5c3310b5b5e67ee44c42a8f4c0e6c7827776fd046304bab7b307"} Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.833563 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ncb4h" event={"ID":"33682180-e2b1-4c20-a374-1a90e1ccea48","Type":"ContainerStarted","Data":"b46df9c1fd68942af3a0d55f5470efc7009a80b9e946652e536d88fee6b65e76"} Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.851925 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ada9cfeb6a025a91e338410d7105720ab2f63a7cd8ce8f9992f12746a0d195c" Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.852106 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l5h78-config-ffx5v" Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.862932 4721 generic.go:334] "Generic (PLEG): container finished" podID="375b0aad-b921-41d8-af30-181ac4a73c0b" containerID="b4ecf1fb05394c16a116b372fdffdfa0e7375e6cf9e5a5e825266b2a826c68fd" exitCode=0 Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.863161 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9865-account-create-update-5xd7v" event={"ID":"375b0aad-b921-41d8-af30-181ac4a73c0b","Type":"ContainerDied","Data":"b4ecf1fb05394c16a116b372fdffdfa0e7375e6cf9e5a5e825266b2a826c68fd"} Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.867487 4721 generic.go:334] "Generic (PLEG): container finished" podID="45ad4533-c6a5-49da-8f33-23113f8b7fea" containerID="ebaaea23221e663dcf2fea49186c159f19607ea1b1253a2c8f769a54c895b470" exitCode=0 Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.867770 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xs4g5" event={"ID":"45ad4533-c6a5-49da-8f33-23113f8b7fea","Type":"ContainerDied","Data":"ebaaea23221e663dcf2fea49186c159f19607ea1b1253a2c8f769a54c895b470"} Feb 02 13:23:02 crc kubenswrapper[4721]: I0202 13:23:02.915934 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-ncb4h" podStartSLOduration=1.915911707 podStartE2EDuration="1.915911707s" podCreationTimestamp="2026-02-02 13:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:02.891350674 +0000 UTC m=+1323.193865083" watchObservedRunningTime="2026-02-02 13:23:02.915911707 +0000 UTC m=+1323.218426086" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.483980 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-2whnq" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.650792 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/937d142a-7868-4de2-85f3-90dcc5a74019-operator-scripts\") pod \"937d142a-7868-4de2-85f3-90dcc5a74019\" (UID: \"937d142a-7868-4de2-85f3-90dcc5a74019\") " Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.651242 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwcg5\" (UniqueName: \"kubernetes.io/projected/937d142a-7868-4de2-85f3-90dcc5a74019-kube-api-access-dwcg5\") pod \"937d142a-7868-4de2-85f3-90dcc5a74019\" (UID: \"937d142a-7868-4de2-85f3-90dcc5a74019\") " Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.653260 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/937d142a-7868-4de2-85f3-90dcc5a74019-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "937d142a-7868-4de2-85f3-90dcc5a74019" (UID: "937d142a-7868-4de2-85f3-90dcc5a74019"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.653382 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/937d142a-7868-4de2-85f3-90dcc5a74019-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.664486 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/937d142a-7868-4de2-85f3-90dcc5a74019-kube-api-access-dwcg5" (OuterVolumeSpecName: "kube-api-access-dwcg5") pod "937d142a-7868-4de2-85f3-90dcc5a74019" (UID: "937d142a-7868-4de2-85f3-90dcc5a74019"). InnerVolumeSpecName "kube-api-access-dwcg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.755262 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwcg5\" (UniqueName: \"kubernetes.io/projected/937d142a-7868-4de2-85f3-90dcc5a74019-kube-api-access-dwcg5\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.801600 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-15d5-account-create-update-5kl6r" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.815435 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wffvl" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.872080 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-xmp7t" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.884422 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wffvl" event={"ID":"13666544-a226-43ee-84c9-3232e9fff8d4","Type":"ContainerDied","Data":"0f7322d4b1604bf0b23f8dfd7fd198bcd3d01cc971e9e7e970f47421793f65ea"} Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.884469 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f7322d4b1604bf0b23f8dfd7fd198bcd3d01cc971e9e7e970f47421793f65ea" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.884539 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-wffvl"
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.896840 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2whnq" event={"ID":"937d142a-7868-4de2-85f3-90dcc5a74019","Type":"ContainerDied","Data":"ccbde8ddec5b3224b7082f9998adb72d6d3847a838f98d65a4e3c8f153647757"}
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.896889 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccbde8ddec5b3224b7082f9998adb72d6d3847a838f98d65a4e3c8f153647757"
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.896957 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2whnq"
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.903378 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-15d5-account-create-update-5kl6r" event={"ID":"67f56b66-72ae-4c95-8051-dc5f7a0faec4","Type":"ContainerDied","Data":"44f1c86ebd459ef3f37a415120f2dce3ea862497ab8630574d3be6ad60426784"}
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.903455 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44f1c86ebd459ef3f37a415120f2dce3ea862497ab8630574d3be6ad60426784"
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.903529 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-15d5-account-create-update-5kl6r"
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.919342 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"4b0459d96819ed1c4f94cc04d9655e4896c56b29043a3b7e3bdc6c07268930ff"}
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.919384 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"a8c25675fe65b31f7b4fc38d6f04f21bc1b4ad8668c35b5c6d3da5a1d9b594f2"}
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.919397 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"e1d47512b2f14259ddeaad0adf0863d2bd4091f41d5f1ce812357618be0b36ec"}
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.924757 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-xmp7t" event={"ID":"51bc4821-8b8e-4972-a90e-67a7a7b1fee5","Type":"ContainerDied","Data":"5b1beefa4df5e3624727dd6828dfc7ef605a0f5abd94ccfd815abf8f0ff6c96a"}
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.924833 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b1beefa4df5e3624727dd6828dfc7ef605a0f5abd94ccfd815abf8f0ff6c96a"
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.924904 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-xmp7t"
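Two similar sandbox messages alternate through this stretch and are worth telling apart: util.go:30 logs "No sandbox for pod can be found" when a pod is being started for the first time, while util.go:48 logs "No ready sandbox" when a sandbox exists but is no longer ready, which is what these finished db-create and account-create jobs hit once their containers exit and pod_container_deletor prunes the IDs. That is a hedged reading of the two call sites; both end with the kubelet creating a fresh sandbox if the pod is still supposed to run. A sketch that tallies each variant per pod to make recreation churn visible, assuming the message and its pod= field land on one line:

import re
import sys
from collections import Counter

SANDBOX = re.compile(r'"No (ready )?sandbox for pod can be found\. Need to start a new one" pod="([^"]+)"')

tally = Counter()
with open(sys.argv[1]) as log:  # hypothetical saved copy of this kubelet log
    for line in log:
        for ready, pod in SANDBOX.findall(line):
            tally[(pod, "recreate" if ready else "first start")] += 1

for (pod, kind), count in tally.most_common():
    print(f"{count:3d}  {kind:11}  {pod}")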
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.938084 4721 generic.go:334] "Generic (PLEG): container finished" podID="33682180-e2b1-4c20-a374-1a90e1ccea48" containerID="ebcd143f9cd75602d2879409dcec3b4694439187ff4c0fda35cb07bd211f9634" exitCode=0
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.939698 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ncb4h" event={"ID":"33682180-e2b1-4c20-a374-1a90e1ccea48","Type":"ContainerDied","Data":"ebcd143f9cd75602d2879409dcec3b4694439187ff4c0fda35cb07bd211f9634"}
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.963619 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13666544-a226-43ee-84c9-3232e9fff8d4-operator-scripts\") pod \"13666544-a226-43ee-84c9-3232e9fff8d4\" (UID: \"13666544-a226-43ee-84c9-3232e9fff8d4\") "
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.963742 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bspjq\" (UniqueName: \"kubernetes.io/projected/13666544-a226-43ee-84c9-3232e9fff8d4-kube-api-access-bspjq\") pod \"13666544-a226-43ee-84c9-3232e9fff8d4\" (UID: \"13666544-a226-43ee-84c9-3232e9fff8d4\") "
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.963797 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67f56b66-72ae-4c95-8051-dc5f7a0faec4-operator-scripts\") pod \"67f56b66-72ae-4c95-8051-dc5f7a0faec4\" (UID: \"67f56b66-72ae-4c95-8051-dc5f7a0faec4\") "
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.963859 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrlbg\" (UniqueName: \"kubernetes.io/projected/51bc4821-8b8e-4972-a90e-67a7a7b1fee5-kube-api-access-mrlbg\") pod \"51bc4821-8b8e-4972-a90e-67a7a7b1fee5\" (UID: \"51bc4821-8b8e-4972-a90e-67a7a7b1fee5\") "
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.963950 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51bc4821-8b8e-4972-a90e-67a7a7b1fee5-operator-scripts\") pod \"51bc4821-8b8e-4972-a90e-67a7a7b1fee5\" (UID: \"51bc4821-8b8e-4972-a90e-67a7a7b1fee5\") "
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.964086 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9twnj\" (UniqueName: \"kubernetes.io/projected/67f56b66-72ae-4c95-8051-dc5f7a0faec4-kube-api-access-9twnj\") pod \"67f56b66-72ae-4c95-8051-dc5f7a0faec4\" (UID: \"67f56b66-72ae-4c95-8051-dc5f7a0faec4\") "
Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.964154 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13666544-a226-43ee-84c9-3232e9fff8d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "13666544-a226-43ee-84c9-3232e9fff8d4" (UID: "13666544-a226-43ee-84c9-3232e9fff8d4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.964590 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13666544-a226-43ee-84c9-3232e9fff8d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.965634 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51bc4821-8b8e-4972-a90e-67a7a7b1fee5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51bc4821-8b8e-4972-a90e-67a7a7b1fee5" (UID: "51bc4821-8b8e-4972-a90e-67a7a7b1fee5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.965822 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67f56b66-72ae-4c95-8051-dc5f7a0faec4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67f56b66-72ae-4c95-8051-dc5f7a0faec4" (UID: "67f56b66-72ae-4c95-8051-dc5f7a0faec4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.970521 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f56b66-72ae-4c95-8051-dc5f7a0faec4-kube-api-access-9twnj" (OuterVolumeSpecName: "kube-api-access-9twnj") pod "67f56b66-72ae-4c95-8051-dc5f7a0faec4" (UID: "67f56b66-72ae-4c95-8051-dc5f7a0faec4"). InnerVolumeSpecName "kube-api-access-9twnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.970586 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13666544-a226-43ee-84c9-3232e9fff8d4-kube-api-access-bspjq" (OuterVolumeSpecName: "kube-api-access-bspjq") pod "13666544-a226-43ee-84c9-3232e9fff8d4" (UID: "13666544-a226-43ee-84c9-3232e9fff8d4"). InnerVolumeSpecName "kube-api-access-bspjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:03 crc kubenswrapper[4721]: I0202 13:23:03.976589 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51bc4821-8b8e-4972-a90e-67a7a7b1fee5-kube-api-access-mrlbg" (OuterVolumeSpecName: "kube-api-access-mrlbg") pod "51bc4821-8b8e-4972-a90e-67a7a7b1fee5" (UID: "51bc4821-8b8e-4972-a90e-67a7a7b1fee5"). InnerVolumeSpecName "kube-api-access-mrlbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.066113 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9twnj\" (UniqueName: \"kubernetes.io/projected/67f56b66-72ae-4c95-8051-dc5f7a0faec4-kube-api-access-9twnj\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.066440 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bspjq\" (UniqueName: \"kubernetes.io/projected/13666544-a226-43ee-84c9-3232e9fff8d4-kube-api-access-bspjq\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.066451 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67f56b66-72ae-4c95-8051-dc5f7a0faec4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.066464 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrlbg\" (UniqueName: \"kubernetes.io/projected/51bc4821-8b8e-4972-a90e-67a7a7b1fee5-kube-api-access-mrlbg\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.066472 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51bc4821-8b8e-4972-a90e-67a7a7b1fee5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.440261 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aad06c3d-8212-4a61-b491-2af939014fd6" path="/var/lib/kubelet/pods/aad06c3d-8212-4a61-b491-2af939014fd6/volumes" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.494450 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-219b-account-create-update-c48ml" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.581652 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xshcx\" (UniqueName: \"kubernetes.io/projected/46b707f0-c9cf-46b5-b615-4c0ab1da0391-kube-api-access-xshcx\") pod \"46b707f0-c9cf-46b5-b615-4c0ab1da0391\" (UID: \"46b707f0-c9cf-46b5-b615-4c0ab1da0391\") " Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.581768 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46b707f0-c9cf-46b5-b615-4c0ab1da0391-operator-scripts\") pod \"46b707f0-c9cf-46b5-b615-4c0ab1da0391\" (UID: \"46b707f0-c9cf-46b5-b615-4c0ab1da0391\") " Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.582441 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46b707f0-c9cf-46b5-b615-4c0ab1da0391-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46b707f0-c9cf-46b5-b615-4c0ab1da0391" (UID: "46b707f0-c9cf-46b5-b615-4c0ab1da0391"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.584519 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46b707f0-c9cf-46b5-b615-4c0ab1da0391-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.602381 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46b707f0-c9cf-46b5-b615-4c0ab1da0391-kube-api-access-xshcx" (OuterVolumeSpecName: "kube-api-access-xshcx") pod "46b707f0-c9cf-46b5-b615-4c0ab1da0391" (UID: "46b707f0-c9cf-46b5-b615-4c0ab1da0391"). InnerVolumeSpecName "kube-api-access-xshcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.698511 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xshcx\" (UniqueName: \"kubernetes.io/projected/46b707f0-c9cf-46b5-b615-4c0ab1da0391-kube-api-access-xshcx\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.799391 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-6f5e-account-create-update-nk8v6" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.817619 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-xs4g5" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.850384 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9865-account-create-update-5xd7v" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.902139 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vckvd\" (UniqueName: \"kubernetes.io/projected/0d74a2f0-9f60-4f59-92e4-11b9136f1db5-kube-api-access-vckvd\") pod \"0d74a2f0-9f60-4f59-92e4-11b9136f1db5\" (UID: \"0d74a2f0-9f60-4f59-92e4-11b9136f1db5\") " Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.902280 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8m4n\" (UniqueName: \"kubernetes.io/projected/45ad4533-c6a5-49da-8f33-23113f8b7fea-kube-api-access-f8m4n\") pod \"45ad4533-c6a5-49da-8f33-23113f8b7fea\" (UID: \"45ad4533-c6a5-49da-8f33-23113f8b7fea\") " Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.902359 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45ad4533-c6a5-49da-8f33-23113f8b7fea-operator-scripts\") pod \"45ad4533-c6a5-49da-8f33-23113f8b7fea\" (UID: \"45ad4533-c6a5-49da-8f33-23113f8b7fea\") " Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.902408 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d74a2f0-9f60-4f59-92e4-11b9136f1db5-operator-scripts\") pod \"0d74a2f0-9f60-4f59-92e4-11b9136f1db5\" (UID: \"0d74a2f0-9f60-4f59-92e4-11b9136f1db5\") " Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.905978 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d74a2f0-9f60-4f59-92e4-11b9136f1db5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d74a2f0-9f60-4f59-92e4-11b9136f1db5" (UID: "0d74a2f0-9f60-4f59-92e4-11b9136f1db5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.906541 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ad4533-c6a5-49da-8f33-23113f8b7fea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45ad4533-c6a5-49da-8f33-23113f8b7fea" (UID: "45ad4533-c6a5-49da-8f33-23113f8b7fea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.918560 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d74a2f0-9f60-4f59-92e4-11b9136f1db5-kube-api-access-vckvd" (OuterVolumeSpecName: "kube-api-access-vckvd") pod "0d74a2f0-9f60-4f59-92e4-11b9136f1db5" (UID: "0d74a2f0-9f60-4f59-92e4-11b9136f1db5"). InnerVolumeSpecName "kube-api-access-vckvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.921203 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ad4533-c6a5-49da-8f33-23113f8b7fea-kube-api-access-f8m4n" (OuterVolumeSpecName: "kube-api-access-f8m4n") pod "45ad4533-c6a5-49da-8f33-23113f8b7fea" (UID: "45ad4533-c6a5-49da-8f33-23113f8b7fea"). InnerVolumeSpecName "kube-api-access-f8m4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.966746 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-6f5e-account-create-update-nk8v6" event={"ID":"0d74a2f0-9f60-4f59-92e4-11b9136f1db5","Type":"ContainerDied","Data":"169c0d730bc1c83a8cb544a1cea72333ffa248d90f19384ccb9cd170b9f67be0"} Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.966794 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="169c0d730bc1c83a8cb544a1cea72333ffa248d90f19384ccb9cd170b9f67be0" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.966875 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-6f5e-account-create-update-nk8v6" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.968895 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9865-account-create-update-5xd7v" event={"ID":"375b0aad-b921-41d8-af30-181ac4a73c0b","Type":"ContainerDied","Data":"974942fc39aed915ac1ed8e72bf664ad24873fb228a582d3a6ebf0cd4fa2ef59"} Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.968935 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="974942fc39aed915ac1ed8e72bf664ad24873fb228a582d3a6ebf0cd4fa2ef59" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.968979 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9865-account-create-update-5xd7v" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.982588 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"7168af0bcba40eb18448198e217a748ba97ebaa8e8d4a6c817683c95c2ffa9a2"} Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.985027 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-xs4g5" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.985078 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-xs4g5" event={"ID":"45ad4533-c6a5-49da-8f33-23113f8b7fea","Type":"ContainerDied","Data":"29e7addf8e62cdc1322715c0687562f2bbdba871b07331508c1e9cd6c6eb1e35"} Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.985114 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29e7addf8e62cdc1322715c0687562f2bbdba871b07331508c1e9cd6c6eb1e35" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.995603 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-219b-account-create-update-c48ml" event={"ID":"46b707f0-c9cf-46b5-b615-4c0ab1da0391","Type":"ContainerDied","Data":"f41f8d7a37c5122fc63cc4eb313e0c73b2f929235be1be06f31ae30167258bf6"} Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.995656 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f41f8d7a37c5122fc63cc4eb313e0c73b2f929235be1be06f31ae30167258bf6" Feb 02 13:23:04 crc kubenswrapper[4721]: I0202 13:23:04.995695 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-219b-account-create-update-c48ml" Feb 02 13:23:05 crc kubenswrapper[4721]: I0202 13:23:05.005965 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92mfj\" (UniqueName: \"kubernetes.io/projected/375b0aad-b921-41d8-af30-181ac4a73c0b-kube-api-access-92mfj\") pod \"375b0aad-b921-41d8-af30-181ac4a73c0b\" (UID: \"375b0aad-b921-41d8-af30-181ac4a73c0b\") " Feb 02 13:23:05 crc kubenswrapper[4721]: I0202 13:23:05.006292 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/375b0aad-b921-41d8-af30-181ac4a73c0b-operator-scripts\") pod \"375b0aad-b921-41d8-af30-181ac4a73c0b\" (UID: \"375b0aad-b921-41d8-af30-181ac4a73c0b\") " Feb 02 13:23:05 crc kubenswrapper[4721]: I0202 13:23:05.006994 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vckvd\" (UniqueName: \"kubernetes.io/projected/0d74a2f0-9f60-4f59-92e4-11b9136f1db5-kube-api-access-vckvd\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:05 crc kubenswrapper[4721]: I0202 13:23:05.007020 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8m4n\" (UniqueName: \"kubernetes.io/projected/45ad4533-c6a5-49da-8f33-23113f8b7fea-kube-api-access-f8m4n\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:05 crc kubenswrapper[4721]: I0202 13:23:05.007034 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45ad4533-c6a5-49da-8f33-23113f8b7fea-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:05 crc kubenswrapper[4721]: I0202 13:23:05.007047 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d74a2f0-9f60-4f59-92e4-11b9136f1db5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:05 crc kubenswrapper[4721]: I0202 13:23:05.007447 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/375b0aad-b921-41d8-af30-181ac4a73c0b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "375b0aad-b921-41d8-af30-181ac4a73c0b" (UID: "375b0aad-b921-41d8-af30-181ac4a73c0b"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:05 crc kubenswrapper[4721]: I0202 13:23:05.021498 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/375b0aad-b921-41d8-af30-181ac4a73c0b-kube-api-access-92mfj" (OuterVolumeSpecName: "kube-api-access-92mfj") pod "375b0aad-b921-41d8-af30-181ac4a73c0b" (UID: "375b0aad-b921-41d8-af30-181ac4a73c0b"). InnerVolumeSpecName "kube-api-access-92mfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:05 crc kubenswrapper[4721]: I0202 13:23:05.109773 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/375b0aad-b921-41d8-af30-181ac4a73c0b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:05 crc kubenswrapper[4721]: I0202 13:23:05.109809 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92mfj\" (UniqueName: \"kubernetes.io/projected/375b0aad-b921-41d8-af30-181ac4a73c0b-kube-api-access-92mfj\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:07 crc kubenswrapper[4721]: I0202 13:23:07.021083 4721 generic.go:334] "Generic (PLEG): container finished" podID="6a34d077-087f-4b04-98c5-22e09450dcb3" containerID="d77970247eaad2de283a9d77316bf0b2e70ed8392f8f537a13730da5d5109f30" exitCode=0 Feb 02 13:23:07 crc kubenswrapper[4721]: I0202 13:23:07.021205 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6a34d077-087f-4b04-98c5-22e09450dcb3","Type":"ContainerDied","Data":"d77970247eaad2de283a9d77316bf0b2e70ed8392f8f537a13730da5d5109f30"} Feb 02 13:23:07 crc kubenswrapper[4721]: I0202 13:23:07.961027 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ncb4h" Feb 02 13:23:08 crc kubenswrapper[4721]: I0202 13:23:08.047142 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ncb4h" event={"ID":"33682180-e2b1-4c20-a374-1a90e1ccea48","Type":"ContainerDied","Data":"b46df9c1fd68942af3a0d55f5470efc7009a80b9e946652e536d88fee6b65e76"} Feb 02 13:23:08 crc kubenswrapper[4721]: I0202 13:23:08.047185 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b46df9c1fd68942af3a0d55f5470efc7009a80b9e946652e536d88fee6b65e76" Feb 02 13:23:08 crc kubenswrapper[4721]: I0202 13:23:08.047228 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ncb4h" Feb 02 13:23:08 crc kubenswrapper[4721]: I0202 13:23:08.068946 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v98wl\" (UniqueName: \"kubernetes.io/projected/33682180-e2b1-4c20-a374-1a90e1ccea48-kube-api-access-v98wl\") pod \"33682180-e2b1-4c20-a374-1a90e1ccea48\" (UID: \"33682180-e2b1-4c20-a374-1a90e1ccea48\") " Feb 02 13:23:08 crc kubenswrapper[4721]: I0202 13:23:08.069027 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33682180-e2b1-4c20-a374-1a90e1ccea48-operator-scripts\") pod \"33682180-e2b1-4c20-a374-1a90e1ccea48\" (UID: \"33682180-e2b1-4c20-a374-1a90e1ccea48\") " Feb 02 13:23:08 crc kubenswrapper[4721]: I0202 13:23:08.070141 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33682180-e2b1-4c20-a374-1a90e1ccea48-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33682180-e2b1-4c20-a374-1a90e1ccea48" (UID: "33682180-e2b1-4c20-a374-1a90e1ccea48"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:08 crc kubenswrapper[4721]: I0202 13:23:08.074554 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33682180-e2b1-4c20-a374-1a90e1ccea48-kube-api-access-v98wl" (OuterVolumeSpecName: "kube-api-access-v98wl") pod "33682180-e2b1-4c20-a374-1a90e1ccea48" (UID: "33682180-e2b1-4c20-a374-1a90e1ccea48"). InnerVolumeSpecName "kube-api-access-v98wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:08 crc kubenswrapper[4721]: I0202 13:23:08.172240 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v98wl\" (UniqueName: \"kubernetes.io/projected/33682180-e2b1-4c20-a374-1a90e1ccea48-kube-api-access-v98wl\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:08 crc kubenswrapper[4721]: I0202 13:23:08.172276 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33682180-e2b1-4c20-a374-1a90e1ccea48-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:09 crc kubenswrapper[4721]: I0202 13:23:09.080124 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6a34d077-087f-4b04-98c5-22e09450dcb3","Type":"ContainerStarted","Data":"23c6cfe3194c224b23120bb9f19a38bd8ee66c14a40b76a6edd0e87659480c40"} Feb 02 13:23:09 crc kubenswrapper[4721]: I0202 13:23:09.083107 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wvp2n" event={"ID":"3395efa1-7b43-4b48-9e06-764b9428c5ab","Type":"ContainerStarted","Data":"3bcf52e41be39651ef275074658dfd7b224ff525c369aa709ad250ff99c12eb1"} Feb 02 13:23:09 crc kubenswrapper[4721]: I0202 13:23:09.106798 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"c41c7f6eec113dd73d809fc77c2060b653bcff6e3c1a843f75101eb9190fbc63"} Feb 02 13:23:09 crc kubenswrapper[4721]: I0202 13:23:09.116349 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-wvp2n" podStartSLOduration=2.6092471809999997 podStartE2EDuration="10.116329123s" podCreationTimestamp="2026-02-02 13:22:59 +0000 UTC" firstStartedPulling="2026-02-02 13:23:00.818689417 +0000 UTC 
m=+1321.121203806" lastFinishedPulling="2026-02-02 13:23:08.325771369 +0000 UTC m=+1328.628285748" observedRunningTime="2026-02-02 13:23:09.111922064 +0000 UTC m=+1329.414436453" watchObservedRunningTime="2026-02-02 13:23:09.116329123 +0000 UTC m=+1329.418843512" Feb 02 13:23:10 crc kubenswrapper[4721]: I0202 13:23:10.134380 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"722886dac37f34d4c44ff7250903a50d8823214ab95bbc2b6cbc259a35ffc1c0"} Feb 02 13:23:10 crc kubenswrapper[4721]: I0202 13:23:10.134899 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"5fcc5cd4aae3def8d9c518d7b5ab7a2ad186936d52ea53b6af7f7d6ce47ef8c6"} Feb 02 13:23:10 crc kubenswrapper[4721]: I0202 13:23:10.134918 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"991d8d80fab0ad41caf90be0e9e35be6d4e89e176ecbcad61b359ba57cf0ad62"} Feb 02 13:23:10 crc kubenswrapper[4721]: I0202 13:23:10.134933 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"62671d1d620f1f8547f0de767d3b54371754cd3be2a71c8ada676ab96a0c98de"} Feb 02 13:23:10 crc kubenswrapper[4721]: I0202 13:23:10.140184 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hrqtc" event={"ID":"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d","Type":"ContainerStarted","Data":"08a73cbae26287f30a607e9d3bb9b367097d0316dac19c52bb303a922febd87c"} Feb 02 13:23:10 crc kubenswrapper[4721]: I0202 13:23:10.165608 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-hrqtc" podStartSLOduration=3.07168417 podStartE2EDuration="36.16558858s" podCreationTimestamp="2026-02-02 13:22:34 +0000 UTC" firstStartedPulling="2026-02-02 13:22:35.813096923 +0000 UTC m=+1296.115611312" lastFinishedPulling="2026-02-02 13:23:08.907001333 +0000 UTC m=+1329.209515722" observedRunningTime="2026-02-02 13:23:10.159508406 +0000 UTC m=+1330.462022795" watchObservedRunningTime="2026-02-02 13:23:10.16558858 +0000 UTC m=+1330.468102969" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.161052 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"36f269ef6126c36a0da5dca794d54147445850b21de602541b9b0d786d1c3590"} Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.162412 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"eabe6b07-da9d-4980-99b4-12c02640c88d","Type":"ContainerStarted","Data":"6cd48b9830efbc68918f9e68e64aa5bd7d26a8204ef9baf7243cb58a58c32b96"} Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.215652 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=38.089684651 podStartE2EDuration="50.215615019s" podCreationTimestamp="2026-02-02 13:22:21 +0000 UTC" firstStartedPulling="2026-02-02 13:22:56.199838181 +0000 UTC m=+1316.502352570" lastFinishedPulling="2026-02-02 13:23:08.325768549 +0000 UTC m=+1328.628282938" observedRunningTime="2026-02-02 13:23:11.202860294 +0000 UTC m=+1331.505374693" 
watchObservedRunningTime="2026-02-02 13:23:11.215615019 +0000 UTC m=+1331.518129408"
Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.594771 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kr74s"]
Feb 02 13:23:11 crc kubenswrapper[4721]: E0202 13:23:11.595307 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51bc4821-8b8e-4972-a90e-67a7a7b1fee5" containerName="mariadb-database-create"
Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595328 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="51bc4821-8b8e-4972-a90e-67a7a7b1fee5" containerName="mariadb-database-create"
Feb 02 13:23:11 crc kubenswrapper[4721]: E0202 13:23:11.595353 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad06c3d-8212-4a61-b491-2af939014fd6" containerName="ovn-config"
Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595363 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad06c3d-8212-4a61-b491-2af939014fd6" containerName="ovn-config"
Feb 02 13:23:11 crc kubenswrapper[4721]: E0202 13:23:11.595377 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33682180-e2b1-4c20-a374-1a90e1ccea48" containerName="mariadb-account-create-update"
Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595385 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="33682180-e2b1-4c20-a374-1a90e1ccea48" containerName="mariadb-account-create-update"
Feb 02 13:23:11 crc kubenswrapper[4721]: E0202 13:23:11.595397 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d74a2f0-9f60-4f59-92e4-11b9136f1db5" containerName="mariadb-account-create-update"
Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595407 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d74a2f0-9f60-4f59-92e4-11b9136f1db5" containerName="mariadb-account-create-update"
Feb 02 13:23:11 crc kubenswrapper[4721]: E0202 13:23:11.595425 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="937d142a-7868-4de2-85f3-90dcc5a74019" containerName="mariadb-database-create"
Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595433 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="937d142a-7868-4de2-85f3-90dcc5a74019" containerName="mariadb-database-create"
Feb 02 13:23:11 crc kubenswrapper[4721]: E0202 13:23:11.595448 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ad4533-c6a5-49da-8f33-23113f8b7fea" containerName="mariadb-database-create"
Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595457 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ad4533-c6a5-49da-8f33-23113f8b7fea" containerName="mariadb-database-create"
Feb 02 13:23:11 crc kubenswrapper[4721]: E0202 13:23:11.595474 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13666544-a226-43ee-84c9-3232e9fff8d4" containerName="mariadb-database-create"
Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595482 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="13666544-a226-43ee-84c9-3232e9fff8d4" containerName="mariadb-database-create"
Feb 02 13:23:11 crc kubenswrapper[4721]: E0202 13:23:11.595504 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b707f0-c9cf-46b5-b615-4c0ab1da0391" containerName="mariadb-account-create-update"
Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595511 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b707f0-c9cf-46b5-b615-4c0ab1da0391"
containerName="mariadb-account-create-update" Feb 02 13:23:11 crc kubenswrapper[4721]: E0202 13:23:11.595532 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f56b66-72ae-4c95-8051-dc5f7a0faec4" containerName="mariadb-account-create-update" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595539 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f56b66-72ae-4c95-8051-dc5f7a0faec4" containerName="mariadb-account-create-update" Feb 02 13:23:11 crc kubenswrapper[4721]: E0202 13:23:11.595548 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="375b0aad-b921-41d8-af30-181ac4a73c0b" containerName="mariadb-account-create-update" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595556 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="375b0aad-b921-41d8-af30-181ac4a73c0b" containerName="mariadb-account-create-update" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595772 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d74a2f0-9f60-4f59-92e4-11b9136f1db5" containerName="mariadb-account-create-update" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595784 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="51bc4821-8b8e-4972-a90e-67a7a7b1fee5" containerName="mariadb-database-create" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595799 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="67f56b66-72ae-4c95-8051-dc5f7a0faec4" containerName="mariadb-account-create-update" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595812 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="46b707f0-c9cf-46b5-b615-4c0ab1da0391" containerName="mariadb-account-create-update" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595822 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="45ad4533-c6a5-49da-8f33-23113f8b7fea" containerName="mariadb-database-create" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595834 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="937d142a-7868-4de2-85f3-90dcc5a74019" containerName="mariadb-database-create" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595846 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="33682180-e2b1-4c20-a374-1a90e1ccea48" containerName="mariadb-account-create-update" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595860 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="375b0aad-b921-41d8-af30-181ac4a73c0b" containerName="mariadb-account-create-update" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595873 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="aad06c3d-8212-4a61-b491-2af939014fd6" containerName="ovn-config" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.595885 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="13666544-a226-43ee-84c9-3232e9fff8d4" containerName="mariadb-database-create" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.597323 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.607298 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kr74s"] Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.608192 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.671806 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.672313 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5xtf\" (UniqueName: \"kubernetes.io/projected/3f27ccd0-68e0-47da-a813-83684a0b1787-kube-api-access-v5xtf\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.672716 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-config\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.672790 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-dns-svc\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.672946 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.673003 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.775284 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-config\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.775366 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-dns-svc\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " 
pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.775410 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.775435 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.775491 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.775522 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5xtf\" (UniqueName: \"kubernetes.io/projected/3f27ccd0-68e0-47da-a813-83684a0b1787-kube-api-access-v5xtf\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.776540 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.776552 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-config\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.776712 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-dns-svc\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.776731 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: I0202 13:23:11.777301 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:11 crc kubenswrapper[4721]: 
I0202 13:23:11.968550 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5xtf\" (UniqueName: \"kubernetes.io/projected/3f27ccd0-68e0-47da-a813-83684a0b1787-kube-api-access-v5xtf\") pod \"dnsmasq-dns-764c5664d7-kr74s\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " pod="openstack/dnsmasq-dns-764c5664d7-kr74s"
Feb 02 13:23:12 crc kubenswrapper[4721]: I0202 13:23:12.216919 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-kr74s"
Feb 02 13:23:12 crc kubenswrapper[4721]: W0202 13:23:12.725783 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f27ccd0_68e0_47da_a813_83684a0b1787.slice/crio-cb0e1c2486d91a6a369eb95b549c6f528badf3dc4c8e1256da54b5e5a3c88cfd WatchSource:0}: Error finding container cb0e1c2486d91a6a369eb95b549c6f528badf3dc4c8e1256da54b5e5a3c88cfd: Status 404 returned error can't find the container with id cb0e1c2486d91a6a369eb95b549c6f528badf3dc4c8e1256da54b5e5a3c88cfd
Feb 02 13:23:12 crc kubenswrapper[4721]: I0202 13:23:12.725868 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kr74s"]
Feb 02 13:23:12 crc kubenswrapper[4721]: I0202 13:23:12.840043 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ncb4h"]
Feb 02 13:23:12 crc kubenswrapper[4721]: I0202 13:23:12.849025 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ncb4h"]
Feb 02 13:23:13 crc kubenswrapper[4721]: I0202 13:23:13.194000 4721 generic.go:334] "Generic (PLEG): container finished" podID="3f27ccd0-68e0-47da-a813-83684a0b1787" containerID="cf43302a7f3026dcc314dce1f4dfee24a1b2e493151bf89876aef92fa2f944d8" exitCode=0
Feb 02 13:23:13 crc kubenswrapper[4721]: I0202 13:23:13.194222 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-kr74s" event={"ID":"3f27ccd0-68e0-47da-a813-83684a0b1787","Type":"ContainerDied","Data":"cf43302a7f3026dcc314dce1f4dfee24a1b2e493151bf89876aef92fa2f944d8"}
Feb 02 13:23:13 crc kubenswrapper[4721]: I0202 13:23:13.194271 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-kr74s" event={"ID":"3f27ccd0-68e0-47da-a813-83684a0b1787","Type":"ContainerStarted","Data":"cb0e1c2486d91a6a369eb95b549c6f528badf3dc4c8e1256da54b5e5a3c88cfd"}
Feb 02 13:23:13 crc kubenswrapper[4721]: I0202 13:23:13.202406 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6a34d077-087f-4b04-98c5-22e09450dcb3","Type":"ContainerStarted","Data":"00cace9833aaa02756c80cc7ba5c1b7b04be74037fc31aabefc226265bb46bd7"}
Feb 02 13:23:13 crc kubenswrapper[4721]: I0202 13:23:13.202459 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6a34d077-087f-4b04-98c5-22e09450dcb3","Type":"ContainerStarted","Data":"62357bb4f0d496ffbef788e049aa3d489bf0f7675e4270d9738c7403ce7db9f5"}
Feb 02 13:23:13 crc kubenswrapper[4721]: I0202 13:23:13.267160 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.267138734 podStartE2EDuration="19.267138734s" podCreationTimestamp="2026-02-02 13:22:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:13.25959828 +0000
UTC m=+1333.562112679" watchObservedRunningTime="2026-02-02 13:23:13.267138734 +0000 UTC m=+1333.569653123" Feb 02 13:23:14 crc kubenswrapper[4721]: I0202 13:23:14.214360 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-kr74s" event={"ID":"3f27ccd0-68e0-47da-a813-83684a0b1787","Type":"ContainerStarted","Data":"69ee8084b0c9e06457f722be7465dbe0e100403673683c49147169903cf21344"} Feb 02 13:23:14 crc kubenswrapper[4721]: I0202 13:23:14.234234 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-kr74s" podStartSLOduration=3.2342083 podStartE2EDuration="3.2342083s" podCreationTimestamp="2026-02-02 13:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:14.230357706 +0000 UTC m=+1334.532872095" watchObservedRunningTime="2026-02-02 13:23:14.2342083 +0000 UTC m=+1334.536722689" Feb 02 13:23:14 crc kubenswrapper[4721]: I0202 13:23:14.423510 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33682180-e2b1-4c20-a374-1a90e1ccea48" path="/var/lib/kubelet/pods/33682180-e2b1-4c20-a374-1a90e1ccea48/volumes" Feb 02 13:23:14 crc kubenswrapper[4721]: I0202 13:23:14.706200 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 02 13:23:15 crc kubenswrapper[4721]: I0202 13:23:15.223533 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:16 crc kubenswrapper[4721]: I0202 13:23:16.234633 4721 generic.go:334] "Generic (PLEG): container finished" podID="3395efa1-7b43-4b48-9e06-764b9428c5ab" containerID="3bcf52e41be39651ef275074658dfd7b224ff525c369aa709ad250ff99c12eb1" exitCode=0 Feb 02 13:23:16 crc kubenswrapper[4721]: I0202 13:23:16.234737 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wvp2n" event={"ID":"3395efa1-7b43-4b48-9e06-764b9428c5ab","Type":"ContainerDied","Data":"3bcf52e41be39651ef275074658dfd7b224ff525c369aa709ad250ff99c12eb1"} Feb 02 13:23:17 crc kubenswrapper[4721]: I0202 13:23:17.747975 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wvp2n" Feb 02 13:23:17 crc kubenswrapper[4721]: I0202 13:23:17.870164 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-h675n"] Feb 02 13:23:17 crc kubenswrapper[4721]: E0202 13:23:17.870626 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3395efa1-7b43-4b48-9e06-764b9428c5ab" containerName="keystone-db-sync" Feb 02 13:23:17 crc kubenswrapper[4721]: I0202 13:23:17.870644 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3395efa1-7b43-4b48-9e06-764b9428c5ab" containerName="keystone-db-sync" Feb 02 13:23:17 crc kubenswrapper[4721]: I0202 13:23:17.870857 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="3395efa1-7b43-4b48-9e06-764b9428c5ab" containerName="keystone-db-sync" Feb 02 13:23:17 crc kubenswrapper[4721]: I0202 13:23:17.871547 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-h675n" Feb 02 13:23:17 crc kubenswrapper[4721]: I0202 13:23:17.876745 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 02 13:23:17 crc kubenswrapper[4721]: I0202 13:23:17.882144 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-h675n"] Feb 02 13:23:17 crc kubenswrapper[4721]: I0202 13:23:17.912575 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppjqv\" (UniqueName: \"kubernetes.io/projected/3395efa1-7b43-4b48-9e06-764b9428c5ab-kube-api-access-ppjqv\") pod \"3395efa1-7b43-4b48-9e06-764b9428c5ab\" (UID: \"3395efa1-7b43-4b48-9e06-764b9428c5ab\") " Feb 02 13:23:17 crc kubenswrapper[4721]: I0202 13:23:17.912690 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3395efa1-7b43-4b48-9e06-764b9428c5ab-config-data\") pod \"3395efa1-7b43-4b48-9e06-764b9428c5ab\" (UID: \"3395efa1-7b43-4b48-9e06-764b9428c5ab\") " Feb 02 13:23:17 crc kubenswrapper[4721]: I0202 13:23:17.912807 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3395efa1-7b43-4b48-9e06-764b9428c5ab-combined-ca-bundle\") pod \"3395efa1-7b43-4b48-9e06-764b9428c5ab\" (UID: \"3395efa1-7b43-4b48-9e06-764b9428c5ab\") " Feb 02 13:23:17 crc kubenswrapper[4721]: I0202 13:23:17.929500 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3395efa1-7b43-4b48-9e06-764b9428c5ab-kube-api-access-ppjqv" (OuterVolumeSpecName: "kube-api-access-ppjqv") pod "3395efa1-7b43-4b48-9e06-764b9428c5ab" (UID: "3395efa1-7b43-4b48-9e06-764b9428c5ab"). InnerVolumeSpecName "kube-api-access-ppjqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:17 crc kubenswrapper[4721]: I0202 13:23:17.949134 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3395efa1-7b43-4b48-9e06-764b9428c5ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3395efa1-7b43-4b48-9e06-764b9428c5ab" (UID: "3395efa1-7b43-4b48-9e06-764b9428c5ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:17 crc kubenswrapper[4721]: I0202 13:23:17.970900 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3395efa1-7b43-4b48-9e06-764b9428c5ab-config-data" (OuterVolumeSpecName: "config-data") pod "3395efa1-7b43-4b48-9e06-764b9428c5ab" (UID: "3395efa1-7b43-4b48-9e06-764b9428c5ab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.016623 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvhpv\" (UniqueName: \"kubernetes.io/projected/71ef45b1-9ff2-40ca-950a-07746f51eca9-kube-api-access-rvhpv\") pod \"root-account-create-update-h675n\" (UID: \"71ef45b1-9ff2-40ca-950a-07746f51eca9\") " pod="openstack/root-account-create-update-h675n" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.016811 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71ef45b1-9ff2-40ca-950a-07746f51eca9-operator-scripts\") pod \"root-account-create-update-h675n\" (UID: \"71ef45b1-9ff2-40ca-950a-07746f51eca9\") " pod="openstack/root-account-create-update-h675n" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.017195 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3395efa1-7b43-4b48-9e06-764b9428c5ab-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.017230 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3395efa1-7b43-4b48-9e06-764b9428c5ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.017246 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppjqv\" (UniqueName: \"kubernetes.io/projected/3395efa1-7b43-4b48-9e06-764b9428c5ab-kube-api-access-ppjqv\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.118480 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvhpv\" (UniqueName: \"kubernetes.io/projected/71ef45b1-9ff2-40ca-950a-07746f51eca9-kube-api-access-rvhpv\") pod \"root-account-create-update-h675n\" (UID: \"71ef45b1-9ff2-40ca-950a-07746f51eca9\") " pod="openstack/root-account-create-update-h675n" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.118570 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71ef45b1-9ff2-40ca-950a-07746f51eca9-operator-scripts\") pod \"root-account-create-update-h675n\" (UID: \"71ef45b1-9ff2-40ca-950a-07746f51eca9\") " pod="openstack/root-account-create-update-h675n" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.119510 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71ef45b1-9ff2-40ca-950a-07746f51eca9-operator-scripts\") pod \"root-account-create-update-h675n\" (UID: \"71ef45b1-9ff2-40ca-950a-07746f51eca9\") " pod="openstack/root-account-create-update-h675n" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.137700 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvhpv\" (UniqueName: \"kubernetes.io/projected/71ef45b1-9ff2-40ca-950a-07746f51eca9-kube-api-access-rvhpv\") pod \"root-account-create-update-h675n\" (UID: \"71ef45b1-9ff2-40ca-950a-07746f51eca9\") " pod="openstack/root-account-create-update-h675n" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.185996 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-h675n" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.256884 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-wvp2n" event={"ID":"3395efa1-7b43-4b48-9e06-764b9428c5ab","Type":"ContainerDied","Data":"9a4595f760619b3ba238fcbf4e59500ea2d92f85648541e4ad855bfe4ec9d3fa"} Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.256940 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a4595f760619b3ba238fcbf4e59500ea2d92f85648541e4ad855bfe4ec9d3fa" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.257011 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-wvp2n" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.535570 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kr74s"] Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.536196 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-kr74s" podUID="3f27ccd0-68e0-47da-a813-83684a0b1787" containerName="dnsmasq-dns" containerID="cri-o://69ee8084b0c9e06457f722be7465dbe0e100403673683c49147169903cf21344" gracePeriod=10 Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.540491 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.625200 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-htxgd"] Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.627531 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.646140 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-l7z2q"] Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.647685 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.650505 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.650695 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7z9s7" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.650853 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.651143 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.651303 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.656885 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-htxgd"] Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.729463 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-l7z2q"] Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.735783 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-combined-ca-bundle\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.741470 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-scripts\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.741527 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-config\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.741613 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-config-data\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.741683 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-fernet-keys\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.741730 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " 
pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.741774 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.742043 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-dns-svc\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.742306 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-credential-keys\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.742363 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.742410 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzqjs\" (UniqueName: \"kubernetes.io/projected/af567124-fd0c-420e-b79b-41e8a7140cef-kube-api-access-hzqjs\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.742458 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2trvt\" (UniqueName: \"kubernetes.io/projected/42cb85a8-831b-4a92-936b-d79276a2d1e5-kube-api-access-2trvt\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.757019 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-n52pp"] Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.760382 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-n52pp" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.763846 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.764246 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-ldgvp" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.770780 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-n52pp"] Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.837153 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-h675n"] Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.858254 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.858343 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.858444 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa244a8-7588-4d87-bd5b-cbcd10780c83-combined-ca-bundle\") pod \"heat-db-sync-n52pp\" (UID: \"9fa244a8-7588-4d87-bd5b-cbcd10780c83\") " pod="openstack/heat-db-sync-n52pp" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.858595 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-dns-svc\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.858739 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-credential-keys\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.858806 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.858832 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fa244a8-7588-4d87-bd5b-cbcd10780c83-config-data\") pod \"heat-db-sync-n52pp\" (UID: \"9fa244a8-7588-4d87-bd5b-cbcd10780c83\") " pod="openstack/heat-db-sync-n52pp" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.858885 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-hzqjs\" (UniqueName: \"kubernetes.io/projected/af567124-fd0c-420e-b79b-41e8a7140cef-kube-api-access-hzqjs\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.858927 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2trvt\" (UniqueName: \"kubernetes.io/projected/42cb85a8-831b-4a92-936b-d79276a2d1e5-kube-api-access-2trvt\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.859032 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-combined-ca-bundle\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.859087 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ct7f\" (UniqueName: \"kubernetes.io/projected/9fa244a8-7588-4d87-bd5b-cbcd10780c83-kube-api-access-2ct7f\") pod \"heat-db-sync-n52pp\" (UID: \"9fa244a8-7588-4d87-bd5b-cbcd10780c83\") " pod="openstack/heat-db-sync-n52pp" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.859131 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-scripts\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.859164 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-config\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.859238 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-config-data\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.859300 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-fernet-keys\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.860409 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.869710 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-config\") pod 
\"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.878652 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.879589 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-dns-svc\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.891239 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.911649 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-scripts\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.913694 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-combined-ca-bundle\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.914056 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-fernet-keys\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.916565 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-2tnbk"] Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.919311 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-2tnbk" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.929391 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4l6jw" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.929662 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.929817 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.932727 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-config-data\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.936762 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-credential-keys\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.955279 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2trvt\" (UniqueName: \"kubernetes.io/projected/42cb85a8-831b-4a92-936b-d79276a2d1e5-kube-api-access-2trvt\") pod \"dnsmasq-dns-5959f8865f-htxgd\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.962511 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fa244a8-7588-4d87-bd5b-cbcd10780c83-config-data\") pod \"heat-db-sync-n52pp\" (UID: \"9fa244a8-7588-4d87-bd5b-cbcd10780c83\") " pod="openstack/heat-db-sync-n52pp" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.962779 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ct7f\" (UniqueName: \"kubernetes.io/projected/9fa244a8-7588-4d87-bd5b-cbcd10780c83-kube-api-access-2ct7f\") pod \"heat-db-sync-n52pp\" (UID: \"9fa244a8-7588-4d87-bd5b-cbcd10780c83\") " pod="openstack/heat-db-sync-n52pp" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.962955 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa244a8-7588-4d87-bd5b-cbcd10780c83-combined-ca-bundle\") pod \"heat-db-sync-n52pp\" (UID: \"9fa244a8-7588-4d87-bd5b-cbcd10780c83\") " pod="openstack/heat-db-sync-n52pp" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.973027 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-7wjxh"] Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.984741 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fa244a8-7588-4d87-bd5b-cbcd10780c83-config-data\") pod \"heat-db-sync-n52pp\" (UID: \"9fa244a8-7588-4d87-bd5b-cbcd10780c83\") " pod="openstack/heat-db-sync-n52pp" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.995726 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzqjs\" (UniqueName: 
\"kubernetes.io/projected/af567124-fd0c-420e-b79b-41e8a7140cef-kube-api-access-hzqjs\") pod \"keystone-bootstrap-l7z2q\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:18 crc kubenswrapper[4721]: I0202 13:23:18.996236 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:18.998904 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.003637 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-v76dv" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.003749 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.004897 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa244a8-7588-4d87-bd5b-cbcd10780c83-combined-ca-bundle\") pod \"heat-db-sync-n52pp\" (UID: \"9fa244a8-7588-4d87-bd5b-cbcd10780c83\") " pod="openstack/heat-db-sync-n52pp" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.038923 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7wjxh"] Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.045356 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ct7f\" (UniqueName: \"kubernetes.io/projected/9fa244a8-7588-4d87-bd5b-cbcd10780c83-kube-api-access-2ct7f\") pod \"heat-db-sync-n52pp\" (UID: \"9fa244a8-7588-4d87-bd5b-cbcd10780c83\") " pod="openstack/heat-db-sync-n52pp" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.061177 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-2tnbk"] Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.061485 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.066786 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026bbe7a-aec9-40ee-9be3-cdb35054e076-combined-ca-bundle\") pod \"neutron-db-sync-2tnbk\" (UID: \"026bbe7a-aec9-40ee-9be3-cdb35054e076\") " pod="openstack/neutron-db-sync-2tnbk" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.066856 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4b6x\" (UniqueName: \"kubernetes.io/projected/ad3578ef-5d1b-4c52-939c-237feadc1c5c-kube-api-access-c4b6x\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.066918 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad3578ef-5d1b-4c52-939c-237feadc1c5c-etc-machine-id\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.067038 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-db-sync-config-data\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.067089 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-config-data\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.067122 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-scripts\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.070285 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kszgd\" (UniqueName: \"kubernetes.io/projected/026bbe7a-aec9-40ee-9be3-cdb35054e076-kube-api-access-kszgd\") pod \"neutron-db-sync-2tnbk\" (UID: \"026bbe7a-aec9-40ee-9be3-cdb35054e076\") " pod="openstack/neutron-db-sync-2tnbk" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.070376 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/026bbe7a-aec9-40ee-9be3-cdb35054e076-config\") pod \"neutron-db-sync-2tnbk\" (UID: \"026bbe7a-aec9-40ee-9be3-cdb35054e076\") " pod="openstack/neutron-db-sync-2tnbk" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.070522 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-combined-ca-bundle\") pod \"cinder-db-sync-7wjxh\" (UID: 
\"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.148695 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.166029 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-86z2v"] Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.168350 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.173117 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-combined-ca-bundle\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.173198 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026bbe7a-aec9-40ee-9be3-cdb35054e076-combined-ca-bundle\") pod \"neutron-db-sync-2tnbk\" (UID: \"026bbe7a-aec9-40ee-9be3-cdb35054e076\") " pod="openstack/neutron-db-sync-2tnbk" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.173236 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4b6x\" (UniqueName: \"kubernetes.io/projected/ad3578ef-5d1b-4c52-939c-237feadc1c5c-kube-api-access-c4b6x\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.173294 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad3578ef-5d1b-4c52-939c-237feadc1c5c-etc-machine-id\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.173422 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-db-sync-config-data\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.173469 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-config-data\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.173518 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-scripts\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.173556 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kszgd\" (UniqueName: \"kubernetes.io/projected/026bbe7a-aec9-40ee-9be3-cdb35054e076-kube-api-access-kszgd\") pod \"neutron-db-sync-2tnbk\" (UID: 
\"026bbe7a-aec9-40ee-9be3-cdb35054e076\") " pod="openstack/neutron-db-sync-2tnbk" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.173594 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/026bbe7a-aec9-40ee-9be3-cdb35054e076-config\") pod \"neutron-db-sync-2tnbk\" (UID: \"026bbe7a-aec9-40ee-9be3-cdb35054e076\") " pod="openstack/neutron-db-sync-2tnbk" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.185533 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.186008 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tfr2p" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.186343 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.188131 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-htxgd"] Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.191608 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad3578ef-5d1b-4c52-939c-237feadc1c5c-etc-machine-id\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.219784 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-db-sync-config-data\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.222186 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-86z2v"] Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.222621 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-scripts\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.225532 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-config-data\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.226240 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-combined-ca-bundle\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.234502 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/026bbe7a-aec9-40ee-9be3-cdb35054e076-config\") pod \"neutron-db-sync-2tnbk\" (UID: \"026bbe7a-aec9-40ee-9be3-cdb35054e076\") " pod="openstack/neutron-db-sync-2tnbk" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.235185 4721 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026bbe7a-aec9-40ee-9be3-cdb35054e076-combined-ca-bundle\") pod \"neutron-db-sync-2tnbk\" (UID: \"026bbe7a-aec9-40ee-9be3-cdb35054e076\") " pod="openstack/neutron-db-sync-2tnbk" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.236844 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4b6x\" (UniqueName: \"kubernetes.io/projected/ad3578ef-5d1b-4c52-939c-237feadc1c5c-kube-api-access-c4b6x\") pod \"cinder-db-sync-7wjxh\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.249421 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-f8lxs"] Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.258655 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kszgd\" (UniqueName: \"kubernetes.io/projected/026bbe7a-aec9-40ee-9be3-cdb35054e076-kube-api-access-kszgd\") pod \"neutron-db-sync-2tnbk\" (UID: \"026bbe7a-aec9-40ee-9be3-cdb35054e076\") " pod="openstack/neutron-db-sync-2tnbk" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.263392 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.283812 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-f8lxs"] Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.300912 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h675n" event={"ID":"71ef45b1-9ff2-40ca-950a-07746f51eca9","Type":"ContainerStarted","Data":"3cbb26dd329c92faeeafbf64445c9a0bfc1db825e69815225692a1b41aaa4b51"} Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.303297 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8kv\" (UniqueName: \"kubernetes.io/projected/bdd67c16-7130-4095-952f-006aa5bcd5bb-kube-api-access-fc8kv\") pod \"placement-db-sync-86z2v\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.303433 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-combined-ca-bundle\") pod \"placement-db-sync-86z2v\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.303459 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-config-data\") pod \"placement-db-sync-86z2v\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.303496 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-scripts\") pod \"placement-db-sync-86z2v\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.303718 4721 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdd67c16-7130-4095-952f-006aa5bcd5bb-logs\") pod \"placement-db-sync-86z2v\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.331408 4721 generic.go:334] "Generic (PLEG): container finished" podID="3f27ccd0-68e0-47da-a813-83684a0b1787" containerID="69ee8084b0c9e06457f722be7465dbe0e100403673683c49147169903cf21344" exitCode=0 Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.331485 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-cgqfl"] Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.333260 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-kr74s" event={"ID":"3f27ccd0-68e0-47da-a813-83684a0b1787","Type":"ContainerDied","Data":"69ee8084b0c9e06457f722be7465dbe0e100403673683c49147169903cf21344"} Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.333377 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cgqfl" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.342618 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-27rl5" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.342898 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.359977 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cgqfl"] Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.407522 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a4176b-5f58-47a9-a614-e5d05526da18-combined-ca-bundle\") pod \"barbican-db-sync-cgqfl\" (UID: \"47a4176b-5f58-47a9-a614-e5d05526da18\") " pod="openstack/barbican-db-sync-cgqfl" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.407563 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.407607 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdd67c16-7130-4095-952f-006aa5bcd5bb-logs\") pod \"placement-db-sync-86z2v\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.407629 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.407650 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t64pt\" (UniqueName: \"kubernetes.io/projected/42af4b6d-a3ac-4a90-8338-71dcdba65713-kube-api-access-t64pt\") 
pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.407687 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47a4176b-5f58-47a9-a614-e5d05526da18-db-sync-config-data\") pod \"barbican-db-sync-cgqfl\" (UID: \"47a4176b-5f58-47a9-a614-e5d05526da18\") " pod="openstack/barbican-db-sync-cgqfl" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.407708 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-config\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.407733 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.407773 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8kv\" (UniqueName: \"kubernetes.io/projected/bdd67c16-7130-4095-952f-006aa5bcd5bb-kube-api-access-fc8kv\") pod \"placement-db-sync-86z2v\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.407818 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-combined-ca-bundle\") pod \"placement-db-sync-86z2v\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.407836 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-config-data\") pod \"placement-db-sync-86z2v\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.407862 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-scripts\") pod \"placement-db-sync-86z2v\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.407897 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-777ht\" (UniqueName: \"kubernetes.io/projected/47a4176b-5f58-47a9-a614-e5d05526da18-kube-api-access-777ht\") pod \"barbican-db-sync-cgqfl\" (UID: \"47a4176b-5f58-47a9-a614-e5d05526da18\") " pod="openstack/barbican-db-sync-cgqfl" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.407923 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-dns-swift-storage-0\") pod 
\"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.408450 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdd67c16-7130-4095-952f-006aa5bcd5bb-logs\") pod \"placement-db-sync-86z2v\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.426899 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-scripts\") pod \"placement-db-sync-86z2v\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.430153 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-combined-ca-bundle\") pod \"placement-db-sync-86z2v\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.432124 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-config-data\") pod \"placement-db-sync-86z2v\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.436999 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8kv\" (UniqueName: \"kubernetes.io/projected/bdd67c16-7130-4095-952f-006aa5bcd5bb-kube-api-access-fc8kv\") pod \"placement-db-sync-86z2v\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.512327 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a4176b-5f58-47a9-a614-e5d05526da18-combined-ca-bundle\") pod \"barbican-db-sync-cgqfl\" (UID: \"47a4176b-5f58-47a9-a614-e5d05526da18\") " pod="openstack/barbican-db-sync-cgqfl" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.512380 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.512424 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.512445 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t64pt\" (UniqueName: \"kubernetes.io/projected/42af4b6d-a3ac-4a90-8338-71dcdba65713-kube-api-access-t64pt\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc 
kubenswrapper[4721]: I0202 13:23:19.512474 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47a4176b-5f58-47a9-a614-e5d05526da18-db-sync-config-data\") pod \"barbican-db-sync-cgqfl\" (UID: \"47a4176b-5f58-47a9-a614-e5d05526da18\") " pod="openstack/barbican-db-sync-cgqfl" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.512504 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-config\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.512533 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.512648 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-777ht\" (UniqueName: \"kubernetes.io/projected/47a4176b-5f58-47a9-a614-e5d05526da18-kube-api-access-777ht\") pod \"barbican-db-sync-cgqfl\" (UID: \"47a4176b-5f58-47a9-a614-e5d05526da18\") " pod="openstack/barbican-db-sync-cgqfl" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.512682 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.513744 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.513771 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.513956 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-config\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.514773 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.525567 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.539004 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a4176b-5f58-47a9-a614-e5d05526da18-combined-ca-bundle\") pod \"barbican-db-sync-cgqfl\" (UID: \"47a4176b-5f58-47a9-a614-e5d05526da18\") " pod="openstack/barbican-db-sync-cgqfl" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.541643 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47a4176b-5f58-47a9-a614-e5d05526da18-db-sync-config-data\") pod \"barbican-db-sync-cgqfl\" (UID: \"47a4176b-5f58-47a9-a614-e5d05526da18\") " pod="openstack/barbican-db-sync-cgqfl" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.564601 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t64pt\" (UniqueName: \"kubernetes.io/projected/42af4b6d-a3ac-4a90-8338-71dcdba65713-kube-api-access-t64pt\") pod \"dnsmasq-dns-58dd9ff6bc-f8lxs\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.568689 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-n52pp" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.570801 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-777ht\" (UniqueName: \"kubernetes.io/projected/47a4176b-5f58-47a9-a614-e5d05526da18-kube-api-access-777ht\") pod \"barbican-db-sync-cgqfl\" (UID: \"47a4176b-5f58-47a9-a614-e5d05526da18\") " pod="openstack/barbican-db-sync-cgqfl" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.579298 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-2tnbk" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.592596 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.607721 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.630415 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.775963 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.780399 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.785562 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.799249 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.804634 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.927426 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.927917 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-scripts\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.927956 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-log-httpd\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.927999 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbbzr\" (UniqueName: \"kubernetes.io/projected/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-kube-api-access-fbbzr\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.928051 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.928135 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-run-httpd\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.928285 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-config-data\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:19 crc kubenswrapper[4721]: I0202 13:23:19.959192 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-cgqfl" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.032036 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-config-data\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.032140 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.032195 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-scripts\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.032215 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-log-httpd\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.032245 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbbzr\" (UniqueName: \"kubernetes.io/projected/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-kube-api-access-fbbzr\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.032270 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.032332 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-run-httpd\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.033095 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.041990 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-run-httpd\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.048620 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-log-httpd\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.068224 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-config-data\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.104105 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.106103 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-scripts\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.123009 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.136142 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5xtf\" (UniqueName: \"kubernetes.io/projected/3f27ccd0-68e0-47da-a813-83684a0b1787-kube-api-access-v5xtf\") pod \"3f27ccd0-68e0-47da-a813-83684a0b1787\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.136376 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-config\") pod \"3f27ccd0-68e0-47da-a813-83684a0b1787\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.138198 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-dns-swift-storage-0\") pod \"3f27ccd0-68e0-47da-a813-83684a0b1787\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.138237 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-ovsdbserver-nb\") pod \"3f27ccd0-68e0-47da-a813-83684a0b1787\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " Feb 
02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.139913 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbbzr\" (UniqueName: \"kubernetes.io/projected/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-kube-api-access-fbbzr\") pod \"ceilometer-0\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " pod="openstack/ceilometer-0" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.139931 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-dns-svc\") pod \"3f27ccd0-68e0-47da-a813-83684a0b1787\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.140216 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-ovsdbserver-sb\") pod \"3f27ccd0-68e0-47da-a813-83684a0b1787\" (UID: \"3f27ccd0-68e0-47da-a813-83684a0b1787\") " Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.155882 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.162472 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f27ccd0-68e0-47da-a813-83684a0b1787-kube-api-access-v5xtf" (OuterVolumeSpecName: "kube-api-access-v5xtf") pod "3f27ccd0-68e0-47da-a813-83684a0b1787" (UID: "3f27ccd0-68e0-47da-a813-83684a0b1787"). InnerVolumeSpecName "kube-api-access-v5xtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.264704 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-l7z2q"] Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.268312 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5xtf\" (UniqueName: \"kubernetes.io/projected/3f27ccd0-68e0-47da-a813-83684a0b1787-kube-api-access-v5xtf\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.475114 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-config" (OuterVolumeSpecName: "config") pod "3f27ccd0-68e0-47da-a813-83684a0b1787" (UID: "3f27ccd0-68e0-47da-a813-83684a0b1787"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.478033 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-kr74s" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.488271 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.513983 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3f27ccd0-68e0-47da-a813-83684a0b1787" (UID: "3f27ccd0-68e0-47da-a813-83684a0b1787"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.569288 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3f27ccd0-68e0-47da-a813-83684a0b1787" (UID: "3f27ccd0-68e0-47da-a813-83684a0b1787"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.569937 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3f27ccd0-68e0-47da-a813-83684a0b1787" (UID: "3f27ccd0-68e0-47da-a813-83684a0b1787"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.600207 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.600259 4721 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.600275 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.643427 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-kr74s" event={"ID":"3f27ccd0-68e0-47da-a813-83684a0b1787","Type":"ContainerDied","Data":"cb0e1c2486d91a6a369eb95b549c6f528badf3dc4c8e1256da54b5e5a3c88cfd"} Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.643479 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l7z2q" event={"ID":"af567124-fd0c-420e-b79b-41e8a7140cef","Type":"ContainerStarted","Data":"0b6290d6ebded27a4f316246821ca1ecc86a05d55f3135936f20ff3e7cfb9766"} Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.643496 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h675n" event={"ID":"71ef45b1-9ff2-40ca-950a-07746f51eca9","Type":"ContainerStarted","Data":"4fe33e5a77c87b36e6e67a1b771c77b2a949e44a54e8c2f88295e47c2b68d215"} Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.643544 4721 scope.go:117] "RemoveContainer" containerID="69ee8084b0c9e06457f722be7465dbe0e100403673683c49147169903cf21344" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.705900 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-htxgd"] Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.827535 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3f27ccd0-68e0-47da-a813-83684a0b1787" (UID: "3f27ccd0-68e0-47da-a813-83684a0b1787"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.830848 4721 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3f27ccd0-68e0-47da-a813-83684a0b1787-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.875131 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-h675n" podStartSLOduration=3.875044902 podStartE2EDuration="3.875044902s" podCreationTimestamp="2026-02-02 13:23:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:20.62786265 +0000 UTC m=+1340.930377039" watchObservedRunningTime="2026-02-02 13:23:20.875044902 +0000 UTC m=+1341.177559311" Feb 02 13:23:20 crc kubenswrapper[4721]: I0202 13:23:20.904456 4721 scope.go:117] "RemoveContainer" containerID="cf43302a7f3026dcc314dce1f4dfee24a1b2e493151bf89876aef92fa2f944d8" Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.153373 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kr74s"] Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.177368 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kr74s"] Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.213552 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-2tnbk"] Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.313515 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-86z2v"] Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.335171 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-7wjxh"] Feb 02 13:23:21 crc kubenswrapper[4721]: W0202 13:23:21.380334 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad3578ef_5d1b_4c52_939c_237feadc1c5c.slice/crio-85181cad5536c1551bbbb525ba3a357b7e769080b51a0a4639fbdd8c37e6d7bd WatchSource:0}: Error finding container 85181cad5536c1551bbbb525ba3a357b7e769080b51a0a4639fbdd8c37e6d7bd: Status 404 returned error can't find the container with id 85181cad5536c1551bbbb525ba3a357b7e769080b51a0a4639fbdd8c37e6d7bd Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.618060 4721 generic.go:334] "Generic (PLEG): container finished" podID="71ef45b1-9ff2-40ca-950a-07746f51eca9" containerID="4fe33e5a77c87b36e6e67a1b771c77b2a949e44a54e8c2f88295e47c2b68d215" exitCode=0 Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.618422 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h675n" event={"ID":"71ef45b1-9ff2-40ca-950a-07746f51eca9","Type":"ContainerDied","Data":"4fe33e5a77c87b36e6e67a1b771c77b2a949e44a54e8c2f88295e47c2b68d215"} Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.662625 4721 generic.go:334] "Generic (PLEG): container finished" podID="42cb85a8-831b-4a92-936b-d79276a2d1e5" containerID="23e7e8b91a2ef6d2365bf404f515402fb59b28d83b6dfc69437b9e9833746052" exitCode=0 Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.662738 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-htxgd" 
event={"ID":"42cb85a8-831b-4a92-936b-d79276a2d1e5","Type":"ContainerDied","Data":"23e7e8b91a2ef6d2365bf404f515402fb59b28d83b6dfc69437b9e9833746052"} Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.662772 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-htxgd" event={"ID":"42cb85a8-831b-4a92-936b-d79276a2d1e5","Type":"ContainerStarted","Data":"02de98de50d044af23bd1e24a46d0eb4f77865743e664633928ca4a36e2c50f9"} Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.671867 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2tnbk" event={"ID":"026bbe7a-aec9-40ee-9be3-cdb35054e076","Type":"ContainerStarted","Data":"3095e5ccc72506309a2adf6e3ceae18778d2c5f94dff5364525934f800f177aa"} Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.673403 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-n52pp"] Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.675042 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l7z2q" event={"ID":"af567124-fd0c-420e-b79b-41e8a7140cef","Type":"ContainerStarted","Data":"59e47087e25d7a69cc9b0e24b51c0193c1d130de3a6fbb82bf929574bc9e38b6"} Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.695948 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-86z2v" event={"ID":"bdd67c16-7130-4095-952f-006aa5bcd5bb","Type":"ContainerStarted","Data":"42fcaa03785c97d077f0697f3384b2a0825c5d2c67e774961ea3ccb33c705ea9"} Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.698875 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7wjxh" event={"ID":"ad3578ef-5d1b-4c52-939c-237feadc1c5c","Type":"ContainerStarted","Data":"85181cad5536c1551bbbb525ba3a357b7e769080b51a0a4639fbdd8c37e6d7bd"} Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.703103 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:23:21 crc kubenswrapper[4721]: W0202 13:23:21.732514 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47a4176b_5f58_47a9_a614_e5d05526da18.slice/crio-0e87512b76ceddbcd36259192a7d06b2c50b8fc83a99c2a6ecdeee04f9de5d79 WatchSource:0}: Error finding container 0e87512b76ceddbcd36259192a7d06b2c50b8fc83a99c2a6ecdeee04f9de5d79: Status 404 returned error can't find the container with id 0e87512b76ceddbcd36259192a7d06b2c50b8fc83a99c2a6ecdeee04f9de5d79 Feb 02 13:23:21 crc kubenswrapper[4721]: W0202 13:23:21.756921 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42af4b6d_a3ac_4a90_8338_71dcdba65713.slice/crio-456048e529dbcb77018460d463790172fa02060ea250e5dff1aa993f12b0810c WatchSource:0}: Error finding container 456048e529dbcb77018460d463790172fa02060ea250e5dff1aa993f12b0810c: Status 404 returned error can't find the container with id 456048e529dbcb77018460d463790172fa02060ea250e5dff1aa993f12b0810c Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.781865 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cgqfl"] Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.806618 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-f8lxs"] Feb 02 13:23:21 crc kubenswrapper[4721]: I0202 13:23:21.812098 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-bootstrap-l7z2q" podStartSLOduration=3.812050175 podStartE2EDuration="3.812050175s" podCreationTimestamp="2026-02-02 13:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:21.721541539 +0000 UTC m=+1342.024055938" watchObservedRunningTime="2026-02-02 13:23:21.812050175 +0000 UTC m=+1342.114564564" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.175987 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.235409 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-dns-swift-storage-0\") pod \"42cb85a8-831b-4a92-936b-d79276a2d1e5\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.235462 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-ovsdbserver-nb\") pod \"42cb85a8-831b-4a92-936b-d79276a2d1e5\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.235486 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-ovsdbserver-sb\") pod \"42cb85a8-831b-4a92-936b-d79276a2d1e5\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.235597 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2trvt\" (UniqueName: \"kubernetes.io/projected/42cb85a8-831b-4a92-936b-d79276a2d1e5-kube-api-access-2trvt\") pod \"42cb85a8-831b-4a92-936b-d79276a2d1e5\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.235622 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-config\") pod \"42cb85a8-831b-4a92-936b-d79276a2d1e5\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.235693 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-dns-svc\") pod \"42cb85a8-831b-4a92-936b-d79276a2d1e5\" (UID: \"42cb85a8-831b-4a92-936b-d79276a2d1e5\") " Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.247237 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42cb85a8-831b-4a92-936b-d79276a2d1e5-kube-api-access-2trvt" (OuterVolumeSpecName: "kube-api-access-2trvt") pod "42cb85a8-831b-4a92-936b-d79276a2d1e5" (UID: "42cb85a8-831b-4a92-936b-d79276a2d1e5"). InnerVolumeSpecName "kube-api-access-2trvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.281667 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "42cb85a8-831b-4a92-936b-d79276a2d1e5" (UID: "42cb85a8-831b-4a92-936b-d79276a2d1e5"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.313648 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-config" (OuterVolumeSpecName: "config") pod "42cb85a8-831b-4a92-936b-d79276a2d1e5" (UID: "42cb85a8-831b-4a92-936b-d79276a2d1e5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.315663 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "42cb85a8-831b-4a92-936b-d79276a2d1e5" (UID: "42cb85a8-831b-4a92-936b-d79276a2d1e5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.317839 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "42cb85a8-831b-4a92-936b-d79276a2d1e5" (UID: "42cb85a8-831b-4a92-936b-d79276a2d1e5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.324803 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "42cb85a8-831b-4a92-936b-d79276a2d1e5" (UID: "42cb85a8-831b-4a92-936b-d79276a2d1e5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.343575 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2trvt\" (UniqueName: \"kubernetes.io/projected/42cb85a8-831b-4a92-936b-d79276a2d1e5-kube-api-access-2trvt\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.343609 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.343619 4721 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.343631 4721 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.343642 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.343650 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42cb85a8-831b-4a92-936b-d79276a2d1e5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.436618 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f27ccd0-68e0-47da-a813-83684a0b1787" path="/var/lib/kubelet/pods/3f27ccd0-68e0-47da-a813-83684a0b1787/volumes" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.438019 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.717454 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-htxgd" event={"ID":"42cb85a8-831b-4a92-936b-d79276a2d1e5","Type":"ContainerDied","Data":"02de98de50d044af23bd1e24a46d0eb4f77865743e664633928ca4a36e2c50f9"} Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.717772 4721 scope.go:117] "RemoveContainer" containerID="23e7e8b91a2ef6d2365bf404f515402fb59b28d83b6dfc69437b9e9833746052" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.717612 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-htxgd" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.722712 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2tnbk" event={"ID":"026bbe7a-aec9-40ee-9be3-cdb35054e076","Type":"ContainerStarted","Data":"759c834b6a1cc62188124483c6831d2bab037f76c9aac624de4118e2066fe35a"} Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.730731 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cgqfl" event={"ID":"47a4176b-5f58-47a9-a614-e5d05526da18","Type":"ContainerStarted","Data":"0e87512b76ceddbcd36259192a7d06b2c50b8fc83a99c2a6ecdeee04f9de5d79"} Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.734219 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-n52pp" event={"ID":"9fa244a8-7588-4d87-bd5b-cbcd10780c83","Type":"ContainerStarted","Data":"f8c8f50f7ba68fd272f974a962b3d958bc797bb63b6619f9ead2e2ffc4525a32"} Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.737963 4721 generic.go:334] "Generic (PLEG): container finished" podID="42af4b6d-a3ac-4a90-8338-71dcdba65713" containerID="5e753861212a75a6fc0b9472a29c53a493e744b8cbd4bb8e6b1dda52762f0e28" exitCode=0 Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.738035 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" event={"ID":"42af4b6d-a3ac-4a90-8338-71dcdba65713","Type":"ContainerDied","Data":"5e753861212a75a6fc0b9472a29c53a493e744b8cbd4bb8e6b1dda52762f0e28"} Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.738145 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" event={"ID":"42af4b6d-a3ac-4a90-8338-71dcdba65713","Type":"ContainerStarted","Data":"456048e529dbcb77018460d463790172fa02060ea250e5dff1aa993f12b0810c"} Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.744734 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-2tnbk" podStartSLOduration=4.74471589 podStartE2EDuration="4.74471589s" podCreationTimestamp="2026-02-02 13:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:22.744007472 +0000 UTC m=+1343.046521861" watchObservedRunningTime="2026-02-02 13:23:22.74471589 +0000 UTC m=+1343.047230279" Feb 02 13:23:22 crc kubenswrapper[4721]: I0202 13:23:22.753419 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5","Type":"ContainerStarted","Data":"5372b8d7305b4393c88e00abc4c50b4b02eb1dd6564a279f35710dd3d18e6691"} Feb 02 13:23:23 crc kubenswrapper[4721]: I0202 13:23:23.039146 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-htxgd"] Feb 02 13:23:23 crc kubenswrapper[4721]: I0202 13:23:23.064616 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-htxgd"] Feb 02 13:23:23 crc kubenswrapper[4721]: I0202 13:23:23.774900 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-h675n" Feb 02 13:23:23 crc kubenswrapper[4721]: I0202 13:23:23.779240 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h675n" event={"ID":"71ef45b1-9ff2-40ca-950a-07746f51eca9","Type":"ContainerDied","Data":"3cbb26dd329c92faeeafbf64445c9a0bfc1db825e69815225692a1b41aaa4b51"} Feb 02 13:23:23 crc kubenswrapper[4721]: I0202 13:23:23.779298 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cbb26dd329c92faeeafbf64445c9a0bfc1db825e69815225692a1b41aaa4b51" Feb 02 13:23:23 crc kubenswrapper[4721]: I0202 13:23:23.802415 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" event={"ID":"42af4b6d-a3ac-4a90-8338-71dcdba65713","Type":"ContainerStarted","Data":"bc03b54bbb02a10feb6c590756e051033c53cca1bb5f2262c3ad8498b1f79580"} Feb 02 13:23:23 crc kubenswrapper[4721]: I0202 13:23:23.802985 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:23 crc kubenswrapper[4721]: I0202 13:23:23.831143 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" podStartSLOduration=4.831126073 podStartE2EDuration="4.831126073s" podCreationTimestamp="2026-02-02 13:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:23.818229814 +0000 UTC m=+1344.120744213" watchObservedRunningTime="2026-02-02 13:23:23.831126073 +0000 UTC m=+1344.133640462" Feb 02 13:23:23 crc kubenswrapper[4721]: I0202 13:23:23.919370 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71ef45b1-9ff2-40ca-950a-07746f51eca9-operator-scripts\") pod \"71ef45b1-9ff2-40ca-950a-07746f51eca9\" (UID: \"71ef45b1-9ff2-40ca-950a-07746f51eca9\") " Feb 02 13:23:23 crc kubenswrapper[4721]: I0202 13:23:23.919736 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvhpv\" (UniqueName: \"kubernetes.io/projected/71ef45b1-9ff2-40ca-950a-07746f51eca9-kube-api-access-rvhpv\") pod \"71ef45b1-9ff2-40ca-950a-07746f51eca9\" (UID: \"71ef45b1-9ff2-40ca-950a-07746f51eca9\") " Feb 02 13:23:23 crc kubenswrapper[4721]: I0202 13:23:23.920930 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71ef45b1-9ff2-40ca-950a-07746f51eca9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71ef45b1-9ff2-40ca-950a-07746f51eca9" (UID: "71ef45b1-9ff2-40ca-950a-07746f51eca9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:23 crc kubenswrapper[4721]: I0202 13:23:23.924045 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71ef45b1-9ff2-40ca-950a-07746f51eca9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:23 crc kubenswrapper[4721]: I0202 13:23:23.951331 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ef45b1-9ff2-40ca-950a-07746f51eca9-kube-api-access-rvhpv" (OuterVolumeSpecName: "kube-api-access-rvhpv") pod "71ef45b1-9ff2-40ca-950a-07746f51eca9" (UID: "71ef45b1-9ff2-40ca-950a-07746f51eca9"). InnerVolumeSpecName "kube-api-access-rvhpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:24 crc kubenswrapper[4721]: I0202 13:23:24.025665 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvhpv\" (UniqueName: \"kubernetes.io/projected/71ef45b1-9ff2-40ca-950a-07746f51eca9-kube-api-access-rvhpv\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:24 crc kubenswrapper[4721]: I0202 13:23:24.433235 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42cb85a8-831b-4a92-936b-d79276a2d1e5" path="/var/lib/kubelet/pods/42cb85a8-831b-4a92-936b-d79276a2d1e5/volumes" Feb 02 13:23:24 crc kubenswrapper[4721]: I0202 13:23:24.705910 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 02 13:23:24 crc kubenswrapper[4721]: I0202 13:23:24.713665 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 02 13:23:24 crc kubenswrapper[4721]: I0202 13:23:24.844316 4721 generic.go:334] "Generic (PLEG): container finished" podID="0531b398-2d44-42c2-bd6c-9e9f7ab8c85d" containerID="08a73cbae26287f30a607e9d3bb9b367097d0316dac19c52bb303a922febd87c" exitCode=0 Feb 02 13:23:24 crc kubenswrapper[4721]: I0202 13:23:24.844500 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-h675n" Feb 02 13:23:24 crc kubenswrapper[4721]: I0202 13:23:24.845052 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hrqtc" event={"ID":"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d","Type":"ContainerDied","Data":"08a73cbae26287f30a607e9d3bb9b367097d0316dac19c52bb303a922febd87c"} Feb 02 13:23:24 crc kubenswrapper[4721]: I0202 13:23:24.861431 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 02 13:23:27 crc kubenswrapper[4721]: I0202 13:23:27.890194 4721 generic.go:334] "Generic (PLEG): container finished" podID="af567124-fd0c-420e-b79b-41e8a7140cef" containerID="59e47087e25d7a69cc9b0e24b51c0193c1d130de3a6fbb82bf929574bc9e38b6" exitCode=0 Feb 02 13:23:27 crc kubenswrapper[4721]: I0202 13:23:27.890305 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l7z2q" event={"ID":"af567124-fd0c-420e-b79b-41e8a7140cef","Type":"ContainerDied","Data":"59e47087e25d7a69cc9b0e24b51c0193c1d130de3a6fbb82bf929574bc9e38b6"} Feb 02 13:23:27 crc kubenswrapper[4721]: I0202 13:23:27.894497 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hrqtc" event={"ID":"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d","Type":"ContainerDied","Data":"bcc0541c75c4c63b75ba7d9003f7cf6d54e2e725d204aed88938ca8246ddf26a"} Feb 02 13:23:27 crc kubenswrapper[4721]: I0202 13:23:27.894550 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcc0541c75c4c63b75ba7d9003f7cf6d54e2e725d204aed88938ca8246ddf26a" Feb 02 13:23:27 crc kubenswrapper[4721]: I0202 13:23:27.908462 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hrqtc" Feb 02 13:23:28 crc kubenswrapper[4721]: I0202 13:23:28.025320 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-combined-ca-bundle\") pod \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " Feb 02 13:23:28 crc kubenswrapper[4721]: I0202 13:23:28.025401 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-config-data\") pod \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " Feb 02 13:23:28 crc kubenswrapper[4721]: I0202 13:23:28.025508 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-db-sync-config-data\") pod \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " Feb 02 13:23:28 crc kubenswrapper[4721]: I0202 13:23:28.025584 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5rlj\" (UniqueName: \"kubernetes.io/projected/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-kube-api-access-s5rlj\") pod \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\" (UID: \"0531b398-2d44-42c2-bd6c-9e9f7ab8c85d\") " Feb 02 13:23:28 crc kubenswrapper[4721]: I0202 13:23:28.030798 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0531b398-2d44-42c2-bd6c-9e9f7ab8c85d" (UID: "0531b398-2d44-42c2-bd6c-9e9f7ab8c85d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:28 crc kubenswrapper[4721]: I0202 13:23:28.031409 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-kube-api-access-s5rlj" (OuterVolumeSpecName: "kube-api-access-s5rlj") pod "0531b398-2d44-42c2-bd6c-9e9f7ab8c85d" (UID: "0531b398-2d44-42c2-bd6c-9e9f7ab8c85d"). InnerVolumeSpecName "kube-api-access-s5rlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:28 crc kubenswrapper[4721]: I0202 13:23:28.058102 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0531b398-2d44-42c2-bd6c-9e9f7ab8c85d" (UID: "0531b398-2d44-42c2-bd6c-9e9f7ab8c85d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:28 crc kubenswrapper[4721]: I0202 13:23:28.087649 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-config-data" (OuterVolumeSpecName: "config-data") pod "0531b398-2d44-42c2-bd6c-9e9f7ab8c85d" (UID: "0531b398-2d44-42c2-bd6c-9e9f7ab8c85d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:28 crc kubenswrapper[4721]: I0202 13:23:28.129187 4721 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:28 crc kubenswrapper[4721]: I0202 13:23:28.129233 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5rlj\" (UniqueName: \"kubernetes.io/projected/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-kube-api-access-s5rlj\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:28 crc kubenswrapper[4721]: I0202 13:23:28.129248 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:28 crc kubenswrapper[4721]: I0202 13:23:28.129260 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:28 crc kubenswrapper[4721]: I0202 13:23:28.909630 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hrqtc" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.403738 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-f8lxs"] Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.403960 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" podUID="42af4b6d-a3ac-4a90-8338-71dcdba65713" containerName="dnsmasq-dns" containerID="cri-o://bc03b54bbb02a10feb6c590756e051033c53cca1bb5f2262c3ad8498b1f79580" gracePeriod=10 Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.406262 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.506620 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-f9ml5"] Feb 02 13:23:29 crc kubenswrapper[4721]: E0202 13:23:29.507244 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0531b398-2d44-42c2-bd6c-9e9f7ab8c85d" containerName="glance-db-sync" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.507265 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="0531b398-2d44-42c2-bd6c-9e9f7ab8c85d" containerName="glance-db-sync" Feb 02 13:23:29 crc kubenswrapper[4721]: E0202 13:23:29.507295 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cb85a8-831b-4a92-936b-d79276a2d1e5" containerName="init" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.507303 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cb85a8-831b-4a92-936b-d79276a2d1e5" containerName="init" Feb 02 13:23:29 crc kubenswrapper[4721]: E0202 13:23:29.507320 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f27ccd0-68e0-47da-a813-83684a0b1787" containerName="init" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.507328 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f27ccd0-68e0-47da-a813-83684a0b1787" containerName="init" Feb 02 13:23:29 crc kubenswrapper[4721]: E0202 13:23:29.507344 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f27ccd0-68e0-47da-a813-83684a0b1787" containerName="dnsmasq-dns" Feb 02 13:23:29 
crc kubenswrapper[4721]: I0202 13:23:29.507351 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f27ccd0-68e0-47da-a813-83684a0b1787" containerName="dnsmasq-dns" Feb 02 13:23:29 crc kubenswrapper[4721]: E0202 13:23:29.507366 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ef45b1-9ff2-40ca-950a-07746f51eca9" containerName="mariadb-account-create-update" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.507374 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ef45b1-9ff2-40ca-950a-07746f51eca9" containerName="mariadb-account-create-update" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.507619 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cb85a8-831b-4a92-936b-d79276a2d1e5" containerName="init" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.507647 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ef45b1-9ff2-40ca-950a-07746f51eca9" containerName="mariadb-account-create-update" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.507675 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f27ccd0-68e0-47da-a813-83684a0b1787" containerName="dnsmasq-dns" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.507689 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="0531b398-2d44-42c2-bd6c-9e9f7ab8c85d" containerName="glance-db-sync" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.509026 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.530732 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-f9ml5"] Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.585600 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.585725 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.585753 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.585882 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-config\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.586082 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.586198 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sws9d\" (UniqueName: \"kubernetes.io/projected/a879c878-fff2-4aa4-b08a-67d13027b95e-kube-api-access-sws9d\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.631497 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" podUID="42af4b6d-a3ac-4a90-8338-71dcdba65713" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.187:5353: connect: connection refused" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.688214 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.688266 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.688359 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-config\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.688447 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.688525 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sws9d\" (UniqueName: \"kubernetes.io/projected/a879c878-fff2-4aa4-b08a-67d13027b95e-kube-api-access-sws9d\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.688620 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.689118 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-dns-svc\") pod 
\"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.689418 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.689740 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.689853 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-config\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.690953 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.747026 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sws9d\" (UniqueName: \"kubernetes.io/projected/a879c878-fff2-4aa4-b08a-67d13027b95e-kube-api-access-sws9d\") pod \"dnsmasq-dns-785d8bcb8c-f9ml5\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.868863 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.929266 4721 generic.go:334] "Generic (PLEG): container finished" podID="42af4b6d-a3ac-4a90-8338-71dcdba65713" containerID="bc03b54bbb02a10feb6c590756e051033c53cca1bb5f2262c3ad8498b1f79580" exitCode=0 Feb 02 13:23:29 crc kubenswrapper[4721]: I0202 13:23:29.929307 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" event={"ID":"42af4b6d-a3ac-4a90-8338-71dcdba65713","Type":"ContainerDied","Data":"bc03b54bbb02a10feb6c590756e051033c53cca1bb5f2262c3ad8498b1f79580"} Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.242136 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.244737 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.247116 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.247214 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.247367 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-f7kg2" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.258409 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.302580 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/231734f6-4050-4aff-92a2-a92982428b95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.302731 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/231734f6-4050-4aff-92a2-a92982428b95-logs\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.302765 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.302826 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw8rf\" (UniqueName: \"kubernetes.io/projected/231734f6-4050-4aff-92a2-a92982428b95-kube-api-access-tw8rf\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.302855 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.302896 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-scripts\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.302918 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.405289 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw8rf\" (UniqueName: \"kubernetes.io/projected/231734f6-4050-4aff-92a2-a92982428b95-kube-api-access-tw8rf\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.405336 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.405385 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-scripts\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.405412 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-config-data\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.405447 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/231734f6-4050-4aff-92a2-a92982428b95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.405559 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/231734f6-4050-4aff-92a2-a92982428b95-logs\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.405592 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.406919 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/231734f6-4050-4aff-92a2-a92982428b95-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.408682 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/231734f6-4050-4aff-92a2-a92982428b95-logs\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " 
pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.410906 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.411371 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-config-data\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.411667 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.411735 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c7812605d9919b226d4340fce797cd8fb18c9c948d1e68864aa7eb7aeecf4816/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.412470 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-scripts\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.434291 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw8rf\" (UniqueName: \"kubernetes.io/projected/231734f6-4050-4aff-92a2-a92982428b95-kube-api-access-tw8rf\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.477975 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") pod \"glance-default-external-api-0\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.665408 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.912667 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.914386 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.916740 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 02 13:23:30 crc kubenswrapper[4721]: I0202 13:23:30.939314 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.021472 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.021904 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.021997 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.022132 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.022171 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2651c902-94b1-4da3-b03f-cd5aee83749b-logs\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.022193 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2651c902-94b1-4da3-b03f-cd5aee83749b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.022235 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgt6x\" (UniqueName: \"kubernetes.io/projected/2651c902-94b1-4da3-b03f-cd5aee83749b-kube-api-access-jgt6x\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.124822 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.124870 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.124956 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.125060 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.125100 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2651c902-94b1-4da3-b03f-cd5aee83749b-logs\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.125115 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2651c902-94b1-4da3-b03f-cd5aee83749b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.125149 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgt6x\" (UniqueName: \"kubernetes.io/projected/2651c902-94b1-4da3-b03f-cd5aee83749b-kube-api-access-jgt6x\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.125702 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2651c902-94b1-4da3-b03f-cd5aee83749b-logs\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.127162 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2651c902-94b1-4da3-b03f-cd5aee83749b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.132243 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.132286 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f813ebfbde533117d7c5539927c335132efd30cff3e1cb355d78cb9d4c1a927/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.136316 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.138612 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.145505 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgt6x\" (UniqueName: \"kubernetes.io/projected/2651c902-94b1-4da3-b03f-cd5aee83749b-kube-api-access-jgt6x\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.146105 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.181662 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") pod \"glance-default-internal-api-0\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:31 crc kubenswrapper[4721]: I0202 13:23:31.242821 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 13:23:32 crc kubenswrapper[4721]: I0202 13:23:32.240868 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 13:23:32 crc kubenswrapper[4721]: I0202 13:23:32.331302 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 13:23:34 crc kubenswrapper[4721]: I0202 13:23:34.631884 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" podUID="42af4b6d-a3ac-4a90-8338-71dcdba65713" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.187:5353: connect: connection refused" Feb 02 13:23:39 crc kubenswrapper[4721]: I0202 13:23:39.634952 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" podUID="42af4b6d-a3ac-4a90-8338-71dcdba65713" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.187:5353: connect: connection refused" Feb 02 13:23:39 crc kubenswrapper[4721]: I0202 13:23:39.635471 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:43 crc kubenswrapper[4721]: I0202 13:23:43.129549 4721 generic.go:334] "Generic (PLEG): container finished" podID="026bbe7a-aec9-40ee-9be3-cdb35054e076" containerID="759c834b6a1cc62188124483c6831d2bab037f76c9aac624de4118e2066fe35a" exitCode=0 Feb 02 13:23:43 crc kubenswrapper[4721]: I0202 13:23:43.129650 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2tnbk" event={"ID":"026bbe7a-aec9-40ee-9be3-cdb35054e076","Type":"ContainerDied","Data":"759c834b6a1cc62188124483c6831d2bab037f76c9aac624de4118e2066fe35a"} Feb 02 13:23:44 crc kubenswrapper[4721]: I0202 13:23:44.631492 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" podUID="42af4b6d-a3ac-4a90-8338-71dcdba65713" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.187:5353: connect: connection refused" Feb 02 13:23:47 crc kubenswrapper[4721]: I0202 13:23:47.879845 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:47 crc kubenswrapper[4721]: I0202 13:23:47.955665 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-fernet-keys\") pod \"af567124-fd0c-420e-b79b-41e8a7140cef\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " Feb 02 13:23:47 crc kubenswrapper[4721]: I0202 13:23:47.955794 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzqjs\" (UniqueName: \"kubernetes.io/projected/af567124-fd0c-420e-b79b-41e8a7140cef-kube-api-access-hzqjs\") pod \"af567124-fd0c-420e-b79b-41e8a7140cef\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " Feb 02 13:23:47 crc kubenswrapper[4721]: I0202 13:23:47.955858 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-scripts\") pod \"af567124-fd0c-420e-b79b-41e8a7140cef\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " Feb 02 13:23:47 crc kubenswrapper[4721]: I0202 13:23:47.955940 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-config-data\") pod \"af567124-fd0c-420e-b79b-41e8a7140cef\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " Feb 02 13:23:47 crc kubenswrapper[4721]: I0202 13:23:47.956012 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-credential-keys\") pod \"af567124-fd0c-420e-b79b-41e8a7140cef\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " Feb 02 13:23:47 crc kubenswrapper[4721]: I0202 13:23:47.956212 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-combined-ca-bundle\") pod \"af567124-fd0c-420e-b79b-41e8a7140cef\" (UID: \"af567124-fd0c-420e-b79b-41e8a7140cef\") " Feb 02 13:23:47 crc kubenswrapper[4721]: I0202 13:23:47.964049 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-scripts" (OuterVolumeSpecName: "scripts") pod "af567124-fd0c-420e-b79b-41e8a7140cef" (UID: "af567124-fd0c-420e-b79b-41e8a7140cef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:47 crc kubenswrapper[4721]: I0202 13:23:47.964130 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "af567124-fd0c-420e-b79b-41e8a7140cef" (UID: "af567124-fd0c-420e-b79b-41e8a7140cef"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:47 crc kubenswrapper[4721]: I0202 13:23:47.964249 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "af567124-fd0c-420e-b79b-41e8a7140cef" (UID: "af567124-fd0c-420e-b79b-41e8a7140cef"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:47 crc kubenswrapper[4721]: I0202 13:23:47.965118 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af567124-fd0c-420e-b79b-41e8a7140cef-kube-api-access-hzqjs" (OuterVolumeSpecName: "kube-api-access-hzqjs") pod "af567124-fd0c-420e-b79b-41e8a7140cef" (UID: "af567124-fd0c-420e-b79b-41e8a7140cef"). InnerVolumeSpecName "kube-api-access-hzqjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:47 crc kubenswrapper[4721]: I0202 13:23:47.984823 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af567124-fd0c-420e-b79b-41e8a7140cef" (UID: "af567124-fd0c-420e-b79b-41e8a7140cef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:47 crc kubenswrapper[4721]: I0202 13:23:47.985840 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-config-data" (OuterVolumeSpecName: "config-data") pod "af567124-fd0c-420e-b79b-41e8a7140cef" (UID: "af567124-fd0c-420e-b79b-41e8a7140cef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:48 crc kubenswrapper[4721]: I0202 13:23:48.059830 4721 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:48 crc kubenswrapper[4721]: I0202 13:23:48.059894 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:48 crc kubenswrapper[4721]: I0202 13:23:48.059907 4721 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:48 crc kubenswrapper[4721]: I0202 13:23:48.059920 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzqjs\" (UniqueName: \"kubernetes.io/projected/af567124-fd0c-420e-b79b-41e8a7140cef-kube-api-access-hzqjs\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:48 crc kubenswrapper[4721]: I0202 13:23:48.059933 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:48 crc kubenswrapper[4721]: I0202 13:23:48.059945 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af567124-fd0c-420e-b79b-41e8a7140cef-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:48 crc kubenswrapper[4721]: I0202 13:23:48.193656 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-l7z2q" event={"ID":"af567124-fd0c-420e-b79b-41e8a7140cef","Type":"ContainerDied","Data":"0b6290d6ebded27a4f316246821ca1ecc86a05d55f3135936f20ff3e7cfb9766"} Feb 02 13:23:48 crc kubenswrapper[4721]: I0202 13:23:48.193741 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b6290d6ebded27a4f316246821ca1ecc86a05d55f3135936f20ff3e7cfb9766" Feb 02 13:23:48 crc kubenswrapper[4721]: I0202 
13:23:48.193848 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-l7z2q" Feb 02 13:23:48 crc kubenswrapper[4721]: E0202 13:23:48.534341 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Feb 02 13:23:48 crc kubenswrapper[4721]: E0202 13:23:48.534850 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2ct7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-n52pp_openstack(9fa244a8-7588-4d87-bd5b-cbcd10780c83): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:23:48 crc kubenswrapper[4721]: E0202 13:23:48.536572 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-n52pp" podUID="9fa244a8-7588-4d87-bd5b-cbcd10780c83" Feb 02 13:23:48 crc kubenswrapper[4721]: E0202 13:23:48.910435 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 02 13:23:48 crc kubenswrapper[4721]: E0202 13:23:48.910684 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n585hd7h67dhb8h8fh586h5ffhfch5d4hd9h647h545h5fchd4h56bh5b4h9ch5d8h95h67fh555h656hcfhcdhbch5cdhd4hbh564hbbhd7h669q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fbbzr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.007534 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-l7z2q"] Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.016059 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-l7z2q"] Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.027769 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-2tnbk" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.087203 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/026bbe7a-aec9-40ee-9be3-cdb35054e076-config\") pod \"026bbe7a-aec9-40ee-9be3-cdb35054e076\" (UID: \"026bbe7a-aec9-40ee-9be3-cdb35054e076\") " Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.087383 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kszgd\" (UniqueName: \"kubernetes.io/projected/026bbe7a-aec9-40ee-9be3-cdb35054e076-kube-api-access-kszgd\") pod \"026bbe7a-aec9-40ee-9be3-cdb35054e076\" (UID: \"026bbe7a-aec9-40ee-9be3-cdb35054e076\") " Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.087518 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026bbe7a-aec9-40ee-9be3-cdb35054e076-combined-ca-bundle\") pod \"026bbe7a-aec9-40ee-9be3-cdb35054e076\" (UID: \"026bbe7a-aec9-40ee-9be3-cdb35054e076\") " Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.102406 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dw7nl"] Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.102684 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/026bbe7a-aec9-40ee-9be3-cdb35054e076-kube-api-access-kszgd" (OuterVolumeSpecName: "kube-api-access-kszgd") pod "026bbe7a-aec9-40ee-9be3-cdb35054e076" (UID: "026bbe7a-aec9-40ee-9be3-cdb35054e076"). InnerVolumeSpecName "kube-api-access-kszgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:49 crc kubenswrapper[4721]: E0202 13:23:49.103114 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af567124-fd0c-420e-b79b-41e8a7140cef" containerName="keystone-bootstrap" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.103267 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="af567124-fd0c-420e-b79b-41e8a7140cef" containerName="keystone-bootstrap" Feb 02 13:23:49 crc kubenswrapper[4721]: E0202 13:23:49.103344 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="026bbe7a-aec9-40ee-9be3-cdb35054e076" containerName="neutron-db-sync" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.103397 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="026bbe7a-aec9-40ee-9be3-cdb35054e076" containerName="neutron-db-sync" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.103657 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="af567124-fd0c-420e-b79b-41e8a7140cef" containerName="keystone-bootstrap" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.103722 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="026bbe7a-aec9-40ee-9be3-cdb35054e076" containerName="neutron-db-sync" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.104612 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.107222 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.107968 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.108520 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.112418 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.113029 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7z9s7" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.133158 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dw7nl"] Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.134191 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/026bbe7a-aec9-40ee-9be3-cdb35054e076-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "026bbe7a-aec9-40ee-9be3-cdb35054e076" (UID: "026bbe7a-aec9-40ee-9be3-cdb35054e076"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.156253 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/026bbe7a-aec9-40ee-9be3-cdb35054e076-config" (OuterVolumeSpecName: "config") pod "026bbe7a-aec9-40ee-9be3-cdb35054e076" (UID: "026bbe7a-aec9-40ee-9be3-cdb35054e076"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.192512 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-scripts\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.193372 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-fernet-keys\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.193566 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzdbz\" (UniqueName: \"kubernetes.io/projected/d168e414-ab7e-45ad-b142-25dcc1c359b0-kube-api-access-bzdbz\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.193873 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-combined-ca-bundle\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.196600 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-config-data\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.196799 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-credential-keys\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.197090 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026bbe7a-aec9-40ee-9be3-cdb35054e076-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.197164 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/026bbe7a-aec9-40ee-9be3-cdb35054e076-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.197250 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kszgd\" (UniqueName: \"kubernetes.io/projected/026bbe7a-aec9-40ee-9be3-cdb35054e076-kube-api-access-kszgd\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.212171 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-2tnbk" event={"ID":"026bbe7a-aec9-40ee-9be3-cdb35054e076","Type":"ContainerDied","Data":"3095e5ccc72506309a2adf6e3ceae18778d2c5f94dff5364525934f800f177aa"} Feb 02 
13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.212212 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-2tnbk" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.212236 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3095e5ccc72506309a2adf6e3ceae18778d2c5f94dff5364525934f800f177aa" Feb 02 13:23:49 crc kubenswrapper[4721]: E0202 13:23:49.228968 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-n52pp" podUID="9fa244a8-7588-4d87-bd5b-cbcd10780c83" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.300395 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-config-data\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.300614 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-credential-keys\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.301162 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-scripts\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.301360 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-fernet-keys\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.301493 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzdbz\" (UniqueName: \"kubernetes.io/projected/d168e414-ab7e-45ad-b142-25dcc1c359b0-kube-api-access-bzdbz\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.301859 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-combined-ca-bundle\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.313259 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-credential-keys\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.313618 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-config-data\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.315419 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-scripts\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.318057 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-fernet-keys\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.318238 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-combined-ca-bundle\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.321511 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzdbz\" (UniqueName: \"kubernetes.io/projected/d168e414-ab7e-45ad-b142-25dcc1c359b0-kube-api-access-bzdbz\") pod \"keystone-bootstrap-dw7nl\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: I0202 13:23:49.575903 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:23:49 crc kubenswrapper[4721]: E0202 13:23:49.719048 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 02 13:23:49 crc kubenswrapper[4721]: E0202 13:23:49.719369 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-777ht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-cgqfl_openstack(47a4176b-5f58-47a9-a614-e5d05526da18): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:23:49 crc kubenswrapper[4721]: E0202 13:23:49.720658 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-cgqfl" podUID="47a4176b-5f58-47a9-a614-e5d05526da18" Feb 02 13:23:50 crc kubenswrapper[4721]: E0202 13:23:50.247181 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-cgqfl" podUID="47a4176b-5f58-47a9-a614-e5d05526da18" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.295900 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-f9ml5"] Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.365792 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mnq9d"] Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.367559 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.398129 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mnq9d"] Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.438584 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.438676 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.438718 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-dns-svc\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.438796 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcdcl\" (UniqueName: \"kubernetes.io/projected/4b323e62-7a54-4935-8e47-2df809ecb2f9-kube-api-access-gcdcl\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.438826 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-config\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.438860 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.522317 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af567124-fd0c-420e-b79b-41e8a7140cef" path="/var/lib/kubelet/pods/af567124-fd0c-420e-b79b-41e8a7140cef/volumes" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.523044 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7556fd87fb-z78lc"] Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.536632 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7556fd87fb-z78lc"] Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.536721 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.539201 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.540668 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.540725 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-dns-svc\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.540809 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcdcl\" (UniqueName: \"kubernetes.io/projected/4b323e62-7a54-4935-8e47-2df809ecb2f9-kube-api-access-gcdcl\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.540838 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-config\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.540877 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.540918 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.541911 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.542903 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.543672 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-dns-svc\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.543703 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-config\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.544095 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.544639 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4l6jw" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.544803 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.544984 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.611488 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcdcl\" (UniqueName: \"kubernetes.io/projected/4b323e62-7a54-4935-8e47-2df809ecb2f9-kube-api-access-gcdcl\") pod \"dnsmasq-dns-55f844cf75-mnq9d\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.650107 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-ovndb-tls-certs\") pod \"neutron-7556fd87fb-z78lc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.650184 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-combined-ca-bundle\") pod \"neutron-7556fd87fb-z78lc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.650229 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw65p\" (UniqueName: \"kubernetes.io/projected/6f746721-5da3-4418-8ef6-d0b88f2121bc-kube-api-access-jw65p\") pod \"neutron-7556fd87fb-z78lc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.650286 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-config\") pod \"neutron-7556fd87fb-z78lc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.650322 4721 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-httpd-config\") pod \"neutron-7556fd87fb-z78lc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.752718 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw65p\" (UniqueName: \"kubernetes.io/projected/6f746721-5da3-4418-8ef6-d0b88f2121bc-kube-api-access-jw65p\") pod \"neutron-7556fd87fb-z78lc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.752812 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-config\") pod \"neutron-7556fd87fb-z78lc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.752854 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-httpd-config\") pod \"neutron-7556fd87fb-z78lc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.753095 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-ovndb-tls-certs\") pod \"neutron-7556fd87fb-z78lc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.753152 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-combined-ca-bundle\") pod \"neutron-7556fd87fb-z78lc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.757909 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.758010 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-combined-ca-bundle\") pod \"neutron-7556fd87fb-z78lc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.758791 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-ovndb-tls-certs\") pod \"neutron-7556fd87fb-z78lc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.758807 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-httpd-config\") pod \"neutron-7556fd87fb-z78lc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.760316 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-config\") pod \"neutron-7556fd87fb-z78lc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.777418 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw65p\" (UniqueName: \"kubernetes.io/projected/6f746721-5da3-4418-8ef6-d0b88f2121bc-kube-api-access-jw65p\") pod \"neutron-7556fd87fb-z78lc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:50 crc kubenswrapper[4721]: I0202 13:23:50.855349 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:51 crc kubenswrapper[4721]: E0202 13:23:51.749146 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 02 13:23:51 crc kubenswrapper[4721]: E0202 13:23:51.749392 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c4b6x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-7wjxh_openstack(ad3578ef-5d1b-4c52-939c-237feadc1c5c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 13:23:51 crc kubenswrapper[4721]: E0202 13:23:51.750592 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-7wjxh" podUID="ad3578ef-5d1b-4c52-939c-237feadc1c5c" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.244456 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.310195 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.311515 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" event={"ID":"42af4b6d-a3ac-4a90-8338-71dcdba65713","Type":"ContainerDied","Data":"456048e529dbcb77018460d463790172fa02060ea250e5dff1aa993f12b0810c"} Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.311592 4721 scope.go:117] "RemoveContainer" containerID="bc03b54bbb02a10feb6c590756e051033c53cca1bb5f2262c3ad8498b1f79580" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.315573 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t64pt\" (UniqueName: \"kubernetes.io/projected/42af4b6d-a3ac-4a90-8338-71dcdba65713-kube-api-access-t64pt\") pod \"42af4b6d-a3ac-4a90-8338-71dcdba65713\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.315851 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-dns-svc\") pod \"42af4b6d-a3ac-4a90-8338-71dcdba65713\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.317186 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-config\") pod \"42af4b6d-a3ac-4a90-8338-71dcdba65713\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.317246 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-dns-swift-storage-0\") pod \"42af4b6d-a3ac-4a90-8338-71dcdba65713\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.317276 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-ovsdbserver-sb\") pod \"42af4b6d-a3ac-4a90-8338-71dcdba65713\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.317344 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-ovsdbserver-nb\") pod \"42af4b6d-a3ac-4a90-8338-71dcdba65713\" (UID: \"42af4b6d-a3ac-4a90-8338-71dcdba65713\") " Feb 02 13:23:52 crc kubenswrapper[4721]: E0202 13:23:52.317979 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-7wjxh" podUID="ad3578ef-5d1b-4c52-939c-237feadc1c5c" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.354674 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42af4b6d-a3ac-4a90-8338-71dcdba65713-kube-api-access-t64pt" (OuterVolumeSpecName: "kube-api-access-t64pt") pod 
"42af4b6d-a3ac-4a90-8338-71dcdba65713" (UID: "42af4b6d-a3ac-4a90-8338-71dcdba65713"). InnerVolumeSpecName "kube-api-access-t64pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.366666 4721 scope.go:117] "RemoveContainer" containerID="5e753861212a75a6fc0b9472a29c53a493e744b8cbd4bb8e6b1dda52762f0e28" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.426361 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t64pt\" (UniqueName: \"kubernetes.io/projected/42af4b6d-a3ac-4a90-8338-71dcdba65713-kube-api-access-t64pt\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.431084 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "42af4b6d-a3ac-4a90-8338-71dcdba65713" (UID: "42af4b6d-a3ac-4a90-8338-71dcdba65713"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.454778 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "42af4b6d-a3ac-4a90-8338-71dcdba65713" (UID: "42af4b6d-a3ac-4a90-8338-71dcdba65713"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.458506 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "42af4b6d-a3ac-4a90-8338-71dcdba65713" (UID: "42af4b6d-a3ac-4a90-8338-71dcdba65713"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.461953 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-config" (OuterVolumeSpecName: "config") pod "42af4b6d-a3ac-4a90-8338-71dcdba65713" (UID: "42af4b6d-a3ac-4a90-8338-71dcdba65713"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.462880 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "42af4b6d-a3ac-4a90-8338-71dcdba65713" (UID: "42af4b6d-a3ac-4a90-8338-71dcdba65713"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.528882 4721 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.528920 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.528931 4721 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.528947 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.528959 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42af4b6d-a3ac-4a90-8338-71dcdba65713-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.663106 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-f8lxs"] Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.679201 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-f8lxs"] Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.877254 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-f9ml5"] Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.897404 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-59d9f7977f-7dt9k"] Feb 02 13:23:52 crc kubenswrapper[4721]: E0202 13:23:52.897934 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42af4b6d-a3ac-4a90-8338-71dcdba65713" containerName="dnsmasq-dns" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.897950 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="42af4b6d-a3ac-4a90-8338-71dcdba65713" containerName="dnsmasq-dns" Feb 02 13:23:52 crc kubenswrapper[4721]: E0202 13:23:52.898005 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42af4b6d-a3ac-4a90-8338-71dcdba65713" containerName="init" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.898015 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="42af4b6d-a3ac-4a90-8338-71dcdba65713" containerName="init" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.898311 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="42af4b6d-a3ac-4a90-8338-71dcdba65713" containerName="dnsmasq-dns" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.904446 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.910021 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.922258 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59d9f7977f-7dt9k"] Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.923222 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.932399 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dw7nl"] Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.944292 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-internal-tls-certs\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.944660 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zxkh\" (UniqueName: \"kubernetes.io/projected/92544741-12fa-42ac-ba5b-67179ec9443b-kube-api-access-9zxkh\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.944740 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-public-tls-certs\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.944807 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-combined-ca-bundle\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.944876 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-config\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.944952 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-ovndb-tls-certs\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:52 crc kubenswrapper[4721]: I0202 13:23:52.944985 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-httpd-config\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:52 crc 
kubenswrapper[4721]: I0202 13:23:52.999060 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.047826 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zxkh\" (UniqueName: \"kubernetes.io/projected/92544741-12fa-42ac-ba5b-67179ec9443b-kube-api-access-9zxkh\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.047910 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-public-tls-certs\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.047964 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-combined-ca-bundle\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.048859 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-config\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.048923 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-ovndb-tls-certs\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.048968 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-httpd-config\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.049160 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-internal-tls-certs\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.056790 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mnq9d"] Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.062884 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-public-tls-certs\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.065543 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-config\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.065644 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-combined-ca-bundle\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.066148 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-internal-tls-certs\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.073305 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-ovndb-tls-certs\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.079775 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-httpd-config\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.090273 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zxkh\" (UniqueName: \"kubernetes.io/projected/92544741-12fa-42ac-ba5b-67179ec9443b-kube-api-access-9zxkh\") pod \"neutron-59d9f7977f-7dt9k\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:53 crc kubenswrapper[4721]: W0202 13:23:53.126688 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b323e62_7a54_4935_8e47_2df809ecb2f9.slice/crio-2496bbb6bf1b14291e279777fda509f8bd82ead08412e32b9ec7b5156bb292a2 WatchSource:0}: Error finding container 2496bbb6bf1b14291e279777fda509f8bd82ead08412e32b9ec7b5156bb292a2: Status 404 returned error can't find the container with id 2496bbb6bf1b14291e279777fda509f8bd82ead08412e32b9ec7b5156bb292a2 Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.231867 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.321790 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-86z2v" event={"ID":"bdd67c16-7130-4095-952f-006aa5bcd5bb","Type":"ContainerStarted","Data":"e7d18b6bc119712cab584a04197b6912f210c5505b0b18fd032278cec2f8f3b5"} Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.325492 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"231734f6-4050-4aff-92a2-a92982428b95","Type":"ContainerStarted","Data":"a421eaea63c1491fb34aa76f2530e7a4cae6ab0162bb28e14e6199cf5d6daf85"} Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.349737 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-86z2v" podStartSLOduration=3.956863726 podStartE2EDuration="34.349715038s" podCreationTimestamp="2026-02-02 13:23:19 +0000 UTC" firstStartedPulling="2026-02-02 13:23:21.315675566 +0000 UTC m=+1341.618189955" lastFinishedPulling="2026-02-02 13:23:51.708526878 +0000 UTC m=+1372.011041267" observedRunningTime="2026-02-02 13:23:53.342525123 +0000 UTC m=+1373.645039512" watchObservedRunningTime="2026-02-02 13:23:53.349715038 +0000 UTC m=+1373.652229447" Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.354122 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" event={"ID":"4b323e62-7a54-4935-8e47-2df809ecb2f9","Type":"ContainerStarted","Data":"2496bbb6bf1b14291e279777fda509f8bd82ead08412e32b9ec7b5156bb292a2"} Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.360541 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" event={"ID":"a879c878-fff2-4aa4-b08a-67d13027b95e","Type":"ContainerStarted","Data":"e8f80343ecb3d990b67219a5c3260755b6d6ddca2e0665007891b6a3ff94aed6"} Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.361802 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dw7nl" event={"ID":"d168e414-ab7e-45ad-b142-25dcc1c359b0","Type":"ContainerStarted","Data":"b5e597a9c6809a0c74e15c50feaa328c23d1128cb144bc36612339923f03dd73"} Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.744163 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7556fd87fb-z78lc"] Feb 02 13:23:53 crc kubenswrapper[4721]: I0202 13:23:53.804782 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.234815 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59d9f7977f-7dt9k"] Feb 02 13:23:54 crc kubenswrapper[4721]: W0202 13:23:54.248864 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92544741_12fa_42ac_ba5b_67179ec9443b.slice/crio-f6156b432d02a8811d9c46ced7a52980aa20788c34415c25f1c59b54dac366c9 WatchSource:0}: Error finding container f6156b432d02a8811d9c46ced7a52980aa20788c34415c25f1c59b54dac366c9: Status 404 returned error can't find the container with id f6156b432d02a8811d9c46ced7a52980aa20788c34415c25f1c59b54dac366c9 Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.379644 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7556fd87fb-z78lc" 
event={"ID":"6f746721-5da3-4418-8ef6-d0b88f2121bc","Type":"ContainerStarted","Data":"4badbbeb19bd1ce4dabd3dbcb1949a66c97f449e7bcb0e940791d3782f7337b1"} Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.380017 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7556fd87fb-z78lc" event={"ID":"6f746721-5da3-4418-8ef6-d0b88f2121bc","Type":"ContainerStarted","Data":"8cc4a5e49bcdc1259392f527ba7a63bedab94aac24105814e1cdaa17c7280e6e"} Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.382045 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5","Type":"ContainerStarted","Data":"532a8ed94d348484657193a2554795492b450f7f802dc97b0881265e4ef935ae"} Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.386630 4721 generic.go:334] "Generic (PLEG): container finished" podID="4b323e62-7a54-4935-8e47-2df809ecb2f9" containerID="ece0ac3461ccc45508567010797bf0f995740c80a06e3f1018effd1949e58be1" exitCode=0 Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.386760 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" event={"ID":"4b323e62-7a54-4935-8e47-2df809ecb2f9","Type":"ContainerDied","Data":"ece0ac3461ccc45508567010797bf0f995740c80a06e3f1018effd1949e58be1"} Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.398772 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59d9f7977f-7dt9k" event={"ID":"92544741-12fa-42ac-ba5b-67179ec9443b","Type":"ContainerStarted","Data":"f6156b432d02a8811d9c46ced7a52980aa20788c34415c25f1c59b54dac366c9"} Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.423294 4721 generic.go:334] "Generic (PLEG): container finished" podID="a879c878-fff2-4aa4-b08a-67d13027b95e" containerID="474f0b3dd5fc3d829759613af04141efbfcae7faec967567689e0cac24ad4e8d" exitCode=0 Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.444684 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42af4b6d-a3ac-4a90-8338-71dcdba65713" path="/var/lib/kubelet/pods/42af4b6d-a3ac-4a90-8338-71dcdba65713/volumes" Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.445329 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"231734f6-4050-4aff-92a2-a92982428b95","Type":"ContainerStarted","Data":"4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2"} Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.445366 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" event={"ID":"a879c878-fff2-4aa4-b08a-67d13027b95e","Type":"ContainerDied","Data":"474f0b3dd5fc3d829759613af04141efbfcae7faec967567689e0cac24ad4e8d"} Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.445392 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2651c902-94b1-4da3-b03f-cd5aee83749b","Type":"ContainerStarted","Data":"de0fa32e0591d0c591d4364744d36cc03bd323f3bbad2d10abfa57cf3278568e"} Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.453388 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dw7nl" event={"ID":"d168e414-ab7e-45ad-b142-25dcc1c359b0","Type":"ContainerStarted","Data":"3705d645077158cd12edf8f0f9b5a39f0ba95d5854f57c056a964be7f2bc24c9"} Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.489353 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-bootstrap-dw7nl" podStartSLOduration=5.489330525 podStartE2EDuration="5.489330525s" podCreationTimestamp="2026-02-02 13:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:54.476730313 +0000 UTC m=+1374.779244712" watchObservedRunningTime="2026-02-02 13:23:54.489330525 +0000 UTC m=+1374.791844924" Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.630868 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-f8lxs" podUID="42af4b6d-a3ac-4a90-8338-71dcdba65713" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.187:5353: i/o timeout" Feb 02 13:23:54 crc kubenswrapper[4721]: I0202 13:23:54.978039 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.039580 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-dns-svc\") pod \"a879c878-fff2-4aa4-b08a-67d13027b95e\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.039674 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sws9d\" (UniqueName: \"kubernetes.io/projected/a879c878-fff2-4aa4-b08a-67d13027b95e-kube-api-access-sws9d\") pod \"a879c878-fff2-4aa4-b08a-67d13027b95e\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.039728 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-dns-swift-storage-0\") pod \"a879c878-fff2-4aa4-b08a-67d13027b95e\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.039742 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-ovsdbserver-nb\") pod \"a879c878-fff2-4aa4-b08a-67d13027b95e\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.039811 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-ovsdbserver-sb\") pod \"a879c878-fff2-4aa4-b08a-67d13027b95e\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.039935 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-config\") pod \"a879c878-fff2-4aa4-b08a-67d13027b95e\" (UID: \"a879c878-fff2-4aa4-b08a-67d13027b95e\") " Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.076255 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a879c878-fff2-4aa4-b08a-67d13027b95e-kube-api-access-sws9d" (OuterVolumeSpecName: "kube-api-access-sws9d") pod "a879c878-fff2-4aa4-b08a-67d13027b95e" (UID: "a879c878-fff2-4aa4-b08a-67d13027b95e"). InnerVolumeSpecName "kube-api-access-sws9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.111719 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a879c878-fff2-4aa4-b08a-67d13027b95e" (UID: "a879c878-fff2-4aa4-b08a-67d13027b95e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.120630 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a879c878-fff2-4aa4-b08a-67d13027b95e" (UID: "a879c878-fff2-4aa4-b08a-67d13027b95e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.125788 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a879c878-fff2-4aa4-b08a-67d13027b95e" (UID: "a879c878-fff2-4aa4-b08a-67d13027b95e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.145735 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.145766 4721 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.145777 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.145786 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sws9d\" (UniqueName: \"kubernetes.io/projected/a879c878-fff2-4aa4-b08a-67d13027b95e-kube-api-access-sws9d\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.145971 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a879c878-fff2-4aa4-b08a-67d13027b95e" (UID: "a879c878-fff2-4aa4-b08a-67d13027b95e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.151787 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-config" (OuterVolumeSpecName: "config") pod "a879c878-fff2-4aa4-b08a-67d13027b95e" (UID: "a879c878-fff2-4aa4-b08a-67d13027b95e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.252445 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.252479 4721 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a879c878-fff2-4aa4-b08a-67d13027b95e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:55 crc kubenswrapper[4721]: E0202 13:23:55.454719 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdd67c16_7130_4095_952f_006aa5bcd5bb.slice/crio-conmon-e7d18b6bc119712cab584a04197b6912f210c5505b0b18fd032278cec2f8f3b5.scope\": RecentStats: unable to find data in memory cache]" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.474299 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59d9f7977f-7dt9k" event={"ID":"92544741-12fa-42ac-ba5b-67179ec9443b","Type":"ContainerStarted","Data":"8b6cdf8693a8045891752ca1818e301a94afd2ff81089e5e3f78d2e62be53c12"} Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.474340 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59d9f7977f-7dt9k" event={"ID":"92544741-12fa-42ac-ba5b-67179ec9443b","Type":"ContainerStarted","Data":"da489fd3d8eceb37f5d84a7cb93a4298ee1e4fc025f5a8e61c61624f47e5cc8a"} Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.475245 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.478358 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"231734f6-4050-4aff-92a2-a92982428b95","Type":"ContainerStarted","Data":"b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329"} Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.478463 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="231734f6-4050-4aff-92a2-a92982428b95" containerName="glance-log" containerID="cri-o://4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2" gracePeriod=30 Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.478484 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="231734f6-4050-4aff-92a2-a92982428b95" containerName="glance-httpd" containerID="cri-o://b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329" gracePeriod=30 Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.482263 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" event={"ID":"a879c878-fff2-4aa4-b08a-67d13027b95e","Type":"ContainerDied","Data":"e8f80343ecb3d990b67219a5c3260755b6d6ddca2e0665007891b6a3ff94aed6"} Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.482289 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-f9ml5" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.482313 4721 scope.go:117] "RemoveContainer" containerID="474f0b3dd5fc3d829759613af04141efbfcae7faec967567689e0cac24ad4e8d" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.485473 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2651c902-94b1-4da3-b03f-cd5aee83749b","Type":"ContainerStarted","Data":"9b2f14ad59791ba7c4b7509a93d19935150acbbd21de442df3a99c78406b90d5"} Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.488333 4721 generic.go:334] "Generic (PLEG): container finished" podID="bdd67c16-7130-4095-952f-006aa5bcd5bb" containerID="e7d18b6bc119712cab584a04197b6912f210c5505b0b18fd032278cec2f8f3b5" exitCode=0 Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.488396 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-86z2v" event={"ID":"bdd67c16-7130-4095-952f-006aa5bcd5bb","Type":"ContainerDied","Data":"e7d18b6bc119712cab584a04197b6912f210c5505b0b18fd032278cec2f8f3b5"} Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.506096 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-59d9f7977f-7dt9k" podStartSLOduration=3.506040219 podStartE2EDuration="3.506040219s" podCreationTimestamp="2026-02-02 13:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:55.495207525 +0000 UTC m=+1375.797721914" watchObservedRunningTime="2026-02-02 13:23:55.506040219 +0000 UTC m=+1375.808554618" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.536880 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7556fd87fb-z78lc" event={"ID":"6f746721-5da3-4418-8ef6-d0b88f2121bc","Type":"ContainerStarted","Data":"8f88f40e370779b33eeb9ef5e263779063066216a67b07cb99c3e5b293638a3d"} Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.537109 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.548525 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" event={"ID":"4b323e62-7a54-4935-8e47-2df809ecb2f9","Type":"ContainerStarted","Data":"38beb737c04426dba147dffdfeaee7f338c6927e4b1cdcf880429b179ca9988b"} Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.549214 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.558763 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=26.558717518 podStartE2EDuration="26.558717518s" podCreationTimestamp="2026-02-02 13:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:55.545041507 +0000 UTC m=+1375.847555896" watchObservedRunningTime="2026-02-02 13:23:55.558717518 +0000 UTC m=+1375.861231917" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.619370 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-f9ml5"] Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.630095 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-785d8bcb8c-f9ml5"] Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.636649 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7556fd87fb-z78lc" podStartSLOduration=5.636632371 podStartE2EDuration="5.636632371s" podCreationTimestamp="2026-02-02 13:23:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:55.601266492 +0000 UTC m=+1375.903780881" watchObservedRunningTime="2026-02-02 13:23:55.636632371 +0000 UTC m=+1375.939146760" Feb 02 13:23:55 crc kubenswrapper[4721]: I0202 13:23:55.649878 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" podStartSLOduration=5.64985898 podStartE2EDuration="5.64985898s" podCreationTimestamp="2026-02-02 13:23:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:55.630767552 +0000 UTC m=+1375.933281941" watchObservedRunningTime="2026-02-02 13:23:55.64985898 +0000 UTC m=+1375.952373369" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.255988 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.396671 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") pod \"231734f6-4050-4aff-92a2-a92982428b95\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.396749 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw8rf\" (UniqueName: \"kubernetes.io/projected/231734f6-4050-4aff-92a2-a92982428b95-kube-api-access-tw8rf\") pod \"231734f6-4050-4aff-92a2-a92982428b95\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.396788 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/231734f6-4050-4aff-92a2-a92982428b95-httpd-run\") pod \"231734f6-4050-4aff-92a2-a92982428b95\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.396923 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/231734f6-4050-4aff-92a2-a92982428b95-logs\") pod \"231734f6-4050-4aff-92a2-a92982428b95\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.396956 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-scripts\") pod \"231734f6-4050-4aff-92a2-a92982428b95\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.397005 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-combined-ca-bundle\") pod \"231734f6-4050-4aff-92a2-a92982428b95\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.397100 4721 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-config-data\") pod \"231734f6-4050-4aff-92a2-a92982428b95\" (UID: \"231734f6-4050-4aff-92a2-a92982428b95\") " Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.397470 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/231734f6-4050-4aff-92a2-a92982428b95-logs" (OuterVolumeSpecName: "logs") pod "231734f6-4050-4aff-92a2-a92982428b95" (UID: "231734f6-4050-4aff-92a2-a92982428b95"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.397756 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/231734f6-4050-4aff-92a2-a92982428b95-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "231734f6-4050-4aff-92a2-a92982428b95" (UID: "231734f6-4050-4aff-92a2-a92982428b95"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.397988 4721 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/231734f6-4050-4aff-92a2-a92982428b95-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.398000 4721 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/231734f6-4050-4aff-92a2-a92982428b95-logs\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.407878 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231734f6-4050-4aff-92a2-a92982428b95-kube-api-access-tw8rf" (OuterVolumeSpecName: "kube-api-access-tw8rf") pod "231734f6-4050-4aff-92a2-a92982428b95" (UID: "231734f6-4050-4aff-92a2-a92982428b95"). InnerVolumeSpecName "kube-api-access-tw8rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.411814 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-scripts" (OuterVolumeSpecName: "scripts") pod "231734f6-4050-4aff-92a2-a92982428b95" (UID: "231734f6-4050-4aff-92a2-a92982428b95"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.428046 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a879c878-fff2-4aa4-b08a-67d13027b95e" path="/var/lib/kubelet/pods/a879c878-fff2-4aa4-b08a-67d13027b95e/volumes" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.428523 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663" (OuterVolumeSpecName: "glance") pod "231734f6-4050-4aff-92a2-a92982428b95" (UID: "231734f6-4050-4aff-92a2-a92982428b95"). InnerVolumeSpecName "pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.467236 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "231734f6-4050-4aff-92a2-a92982428b95" (UID: "231734f6-4050-4aff-92a2-a92982428b95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.483268 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-config-data" (OuterVolumeSpecName: "config-data") pod "231734f6-4050-4aff-92a2-a92982428b95" (UID: "231734f6-4050-4aff-92a2-a92982428b95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.500081 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw8rf\" (UniqueName: \"kubernetes.io/projected/231734f6-4050-4aff-92a2-a92982428b95-kube-api-access-tw8rf\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.500424 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.500516 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.500599 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/231734f6-4050-4aff-92a2-a92982428b95-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.500697 4721 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") on node \"crc\" " Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.531558 4721 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.532099 4721 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663") on node "crc" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.595793 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2651c902-94b1-4da3-b03f-cd5aee83749b","Type":"ContainerStarted","Data":"6863179c724438e853ded7b834d2693a1bdec25fbe0b4d4e3ec7a6bb5320d8bb"} Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.595976 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2651c902-94b1-4da3-b03f-cd5aee83749b" containerName="glance-log" containerID="cri-o://9b2f14ad59791ba7c4b7509a93d19935150acbbd21de442df3a99c78406b90d5" gracePeriod=30 Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.597138 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2651c902-94b1-4da3-b03f-cd5aee83749b" containerName="glance-httpd" containerID="cri-o://6863179c724438e853ded7b834d2693a1bdec25fbe0b4d4e3ec7a6bb5320d8bb" gracePeriod=30 Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.606158 4721 reconciler_common.go:293] "Volume detached for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.616838 4721 generic.go:334] "Generic (PLEG): container finished" podID="231734f6-4050-4aff-92a2-a92982428b95" containerID="b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329" exitCode=143 Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.616878 4721 generic.go:334] "Generic (PLEG): container finished" podID="231734f6-4050-4aff-92a2-a92982428b95" containerID="4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2" exitCode=143 Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.619509 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"231734f6-4050-4aff-92a2-a92982428b95","Type":"ContainerDied","Data":"b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329"} Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.619571 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"231734f6-4050-4aff-92a2-a92982428b95","Type":"ContainerDied","Data":"4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2"} Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.619584 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"231734f6-4050-4aff-92a2-a92982428b95","Type":"ContainerDied","Data":"a421eaea63c1491fb34aa76f2530e7a4cae6ab0162bb28e14e6199cf5d6daf85"} Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.619602 4721 scope.go:117] "RemoveContainer" containerID="b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.619857 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.633004 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=27.632978523 podStartE2EDuration="27.632978523s" podCreationTimestamp="2026-02-02 13:23:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:23:56.619478236 +0000 UTC m=+1376.921992625" watchObservedRunningTime="2026-02-02 13:23:56.632978523 +0000 UTC m=+1376.935492912" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.682793 4721 scope.go:117] "RemoveContainer" containerID="4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.682975 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.705137 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.735250 4721 scope.go:117] "RemoveContainer" containerID="b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.735364 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 13:23:56 crc kubenswrapper[4721]: E0202 13:23:56.735899 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231734f6-4050-4aff-92a2-a92982428b95" containerName="glance-log" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.735911 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="231734f6-4050-4aff-92a2-a92982428b95" containerName="glance-log" Feb 02 13:23:56 crc kubenswrapper[4721]: E0202 13:23:56.735922 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a879c878-fff2-4aa4-b08a-67d13027b95e" containerName="init" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.735927 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="a879c878-fff2-4aa4-b08a-67d13027b95e" containerName="init" Feb 02 13:23:56 crc kubenswrapper[4721]: E0202 13:23:56.735945 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231734f6-4050-4aff-92a2-a92982428b95" containerName="glance-httpd" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.735950 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="231734f6-4050-4aff-92a2-a92982428b95" containerName="glance-httpd" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.736150 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="a879c878-fff2-4aa4-b08a-67d13027b95e" containerName="init" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.736167 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="231734f6-4050-4aff-92a2-a92982428b95" containerName="glance-httpd" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.736184 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="231734f6-4050-4aff-92a2-a92982428b95" containerName="glance-log" Feb 02 13:23:56 crc kubenswrapper[4721]: E0202 13:23:56.736513 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329\": container with ID starting with 
b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329 not found: ID does not exist" containerID="b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.737388 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.737430 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329"} err="failed to get container status \"b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329\": rpc error: code = NotFound desc = could not find container \"b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329\": container with ID starting with b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329 not found: ID does not exist" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.738730 4721 scope.go:117] "RemoveContainer" containerID="4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.739968 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 02 13:23:56 crc kubenswrapper[4721]: E0202 13:23:56.740186 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2\": container with ID starting with 4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2 not found: ID does not exist" containerID="4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.740213 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2"} err="failed to get container status \"4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2\": rpc error: code = NotFound desc = could not find container \"4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2\": container with ID starting with 4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2 not found: ID does not exist" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.740237 4721 scope.go:117] "RemoveContainer" containerID="b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.740377 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.740715 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329"} err="failed to get container status \"b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329\": rpc error: code = NotFound desc = could not find container \"b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329\": container with ID starting with b3242cfa6b241cc91e98f1adae423b7d5225bd04f979b2258cca2ccaec320329 not found: ID does not exist" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.740736 4721 scope.go:117] "RemoveContainer" containerID="4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 
13:23:56.741055 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2"} err="failed to get container status \"4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2\": rpc error: code = NotFound desc = could not find container \"4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2\": container with ID starting with 4c64a38f5abe2c61595868db1d01effc7845b6eae80975a67d802574bacd5fa2 not found: ID does not exist" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.760707 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.814448 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.814513 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.814562 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.814609 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-scripts\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.814722 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpqt2\" (UniqueName: \"kubernetes.io/projected/7cdd3f19-3e66-4807-a0e8-957c713cef36-kube-api-access-vpqt2\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.814867 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cdd3f19-3e66-4807-a0e8-957c713cef36-logs\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.814937 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-config-data\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " 
pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.814967 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cdd3f19-3e66-4807-a0e8-957c713cef36-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.918648 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.918726 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.918828 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.918916 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-scripts\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.918988 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpqt2\" (UniqueName: \"kubernetes.io/projected/7cdd3f19-3e66-4807-a0e8-957c713cef36-kube-api-access-vpqt2\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.919089 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cdd3f19-3e66-4807-a0e8-957c713cef36-logs\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.919142 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-config-data\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.919168 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cdd3f19-3e66-4807-a0e8-957c713cef36-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " 
pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.919914 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cdd3f19-3e66-4807-a0e8-957c713cef36-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.920515 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cdd3f19-3e66-4807-a0e8-957c713cef36-logs\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.930798 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-scripts\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.930872 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.930920 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c7812605d9919b226d4340fce797cd8fb18c9c948d1e68864aa7eb7aeecf4816/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.937808 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.950227 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.950768 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-config-data\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:56 crc kubenswrapper[4721]: I0202 13:23:56.959130 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpqt2\" (UniqueName: \"kubernetes.io/projected/7cdd3f19-3e66-4807-a0e8-957c713cef36-kube-api-access-vpqt2\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.057778 
4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") pod \"glance-default-external-api-0\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " pod="openstack/glance-default-external-api-0" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.298003 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.320672 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.429190 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-combined-ca-bundle\") pod \"bdd67c16-7130-4095-952f-006aa5bcd5bb\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.429314 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-config-data\") pod \"bdd67c16-7130-4095-952f-006aa5bcd5bb\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.429454 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-scripts\") pod \"bdd67c16-7130-4095-952f-006aa5bcd5bb\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.429514 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc8kv\" (UniqueName: \"kubernetes.io/projected/bdd67c16-7130-4095-952f-006aa5bcd5bb-kube-api-access-fc8kv\") pod \"bdd67c16-7130-4095-952f-006aa5bcd5bb\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.429547 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdd67c16-7130-4095-952f-006aa5bcd5bb-logs\") pod \"bdd67c16-7130-4095-952f-006aa5bcd5bb\" (UID: \"bdd67c16-7130-4095-952f-006aa5bcd5bb\") " Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.431245 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdd67c16-7130-4095-952f-006aa5bcd5bb-logs" (OuterVolumeSpecName: "logs") pod "bdd67c16-7130-4095-952f-006aa5bcd5bb" (UID: "bdd67c16-7130-4095-952f-006aa5bcd5bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.444233 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-scripts" (OuterVolumeSpecName: "scripts") pod "bdd67c16-7130-4095-952f-006aa5bcd5bb" (UID: "bdd67c16-7130-4095-952f-006aa5bcd5bb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.445760 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdd67c16-7130-4095-952f-006aa5bcd5bb-kube-api-access-fc8kv" (OuterVolumeSpecName: "kube-api-access-fc8kv") pod "bdd67c16-7130-4095-952f-006aa5bcd5bb" (UID: "bdd67c16-7130-4095-952f-006aa5bcd5bb"). InnerVolumeSpecName "kube-api-access-fc8kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.486503 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdd67c16-7130-4095-952f-006aa5bcd5bb" (UID: "bdd67c16-7130-4095-952f-006aa5bcd5bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.501297 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-config-data" (OuterVolumeSpecName: "config-data") pod "bdd67c16-7130-4095-952f-006aa5bcd5bb" (UID: "bdd67c16-7130-4095-952f-006aa5bcd5bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.536145 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc8kv\" (UniqueName: \"kubernetes.io/projected/bdd67c16-7130-4095-952f-006aa5bcd5bb-kube-api-access-fc8kv\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.536185 4721 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdd67c16-7130-4095-952f-006aa5bcd5bb-logs\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.536198 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.536209 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.536219 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdd67c16-7130-4095-952f-006aa5bcd5bb-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.664050 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-86z2v" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.664108 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-86z2v" event={"ID":"bdd67c16-7130-4095-952f-006aa5bcd5bb","Type":"ContainerDied","Data":"42fcaa03785c97d077f0697f3384b2a0825c5d2c67e774961ea3ccb33c705ea9"} Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.676277 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42fcaa03785c97d077f0697f3384b2a0825c5d2c67e774961ea3ccb33c705ea9" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.691355 4721 generic.go:334] "Generic (PLEG): container finished" podID="2651c902-94b1-4da3-b03f-cd5aee83749b" containerID="6863179c724438e853ded7b834d2693a1bdec25fbe0b4d4e3ec7a6bb5320d8bb" exitCode=0 Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.691396 4721 generic.go:334] "Generic (PLEG): container finished" podID="2651c902-94b1-4da3-b03f-cd5aee83749b" containerID="9b2f14ad59791ba7c4b7509a93d19935150acbbd21de442df3a99c78406b90d5" exitCode=143 Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.691422 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2651c902-94b1-4da3-b03f-cd5aee83749b","Type":"ContainerDied","Data":"6863179c724438e853ded7b834d2693a1bdec25fbe0b4d4e3ec7a6bb5320d8bb"} Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.691456 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2651c902-94b1-4da3-b03f-cd5aee83749b","Type":"ContainerDied","Data":"9b2f14ad59791ba7c4b7509a93d19935150acbbd21de442df3a99c78406b90d5"} Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.732244 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-75b75c495b-kpsxz"] Feb 02 13:23:57 crc kubenswrapper[4721]: E0202 13:23:57.732894 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdd67c16-7130-4095-952f-006aa5bcd5bb" containerName="placement-db-sync" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.732921 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdd67c16-7130-4095-952f-006aa5bcd5bb" containerName="placement-db-sync" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.733227 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdd67c16-7130-4095-952f-006aa5bcd5bb" containerName="placement-db-sync" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.739279 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.753824 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.754061 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.754231 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tfr2p" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.754242 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.754366 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.788989 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75b75c495b-kpsxz"] Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.848328 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/873ec78b-5777-4560-a744-c4789b43d966-logs\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.848372 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-config-data\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.848415 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngbhh\" (UniqueName: \"kubernetes.io/projected/873ec78b-5777-4560-a744-c4789b43d966-kube-api-access-ngbhh\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.848607 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-scripts\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.848837 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-combined-ca-bundle\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.848886 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-public-tls-certs\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.848919 4721 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-internal-tls-certs\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.950692 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/873ec78b-5777-4560-a744-c4789b43d966-logs\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.951009 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-config-data\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.951048 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngbhh\" (UniqueName: \"kubernetes.io/projected/873ec78b-5777-4560-a744-c4789b43d966-kube-api-access-ngbhh\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.951150 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-scripts\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.951286 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-combined-ca-bundle\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.951325 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-public-tls-certs\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.951357 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-internal-tls-certs\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.951584 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/873ec78b-5777-4560-a744-c4789b43d966-logs\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.957495 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-internal-tls-certs\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.958155 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-config-data\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.959384 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-scripts\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.960041 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-combined-ca-bundle\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.963218 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-public-tls-certs\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:57 crc kubenswrapper[4721]: I0202 13:23:57.970880 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngbhh\" (UniqueName: \"kubernetes.io/projected/873ec78b-5777-4560-a744-c4789b43d966-kube-api-access-ngbhh\") pod \"placement-75b75c495b-kpsxz\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.054545 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.148719 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.158963 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2651c902-94b1-4da3-b03f-cd5aee83749b-logs\") pod \"2651c902-94b1-4da3-b03f-cd5aee83749b\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.159045 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgt6x\" (UniqueName: \"kubernetes.io/projected/2651c902-94b1-4da3-b03f-cd5aee83749b-kube-api-access-jgt6x\") pod \"2651c902-94b1-4da3-b03f-cd5aee83749b\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.159178 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-scripts\") pod \"2651c902-94b1-4da3-b03f-cd5aee83749b\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.159221 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-config-data\") pod \"2651c902-94b1-4da3-b03f-cd5aee83749b\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.159336 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") pod \"2651c902-94b1-4da3-b03f-cd5aee83749b\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.159404 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-combined-ca-bundle\") pod \"2651c902-94b1-4da3-b03f-cd5aee83749b\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.159566 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2651c902-94b1-4da3-b03f-cd5aee83749b-httpd-run\") pod \"2651c902-94b1-4da3-b03f-cd5aee83749b\" (UID: \"2651c902-94b1-4da3-b03f-cd5aee83749b\") " Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.181672 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2651c902-94b1-4da3-b03f-cd5aee83749b-logs" (OuterVolumeSpecName: "logs") pod "2651c902-94b1-4da3-b03f-cd5aee83749b" (UID: "2651c902-94b1-4da3-b03f-cd5aee83749b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.191480 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2651c902-94b1-4da3-b03f-cd5aee83749b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2651c902-94b1-4da3-b03f-cd5aee83749b" (UID: "2651c902-94b1-4da3-b03f-cd5aee83749b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.202813 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2651c902-94b1-4da3-b03f-cd5aee83749b-kube-api-access-jgt6x" (OuterVolumeSpecName: "kube-api-access-jgt6x") pod "2651c902-94b1-4da3-b03f-cd5aee83749b" (UID: "2651c902-94b1-4da3-b03f-cd5aee83749b"). InnerVolumeSpecName "kube-api-access-jgt6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.202908 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-scripts" (OuterVolumeSpecName: "scripts") pod "2651c902-94b1-4da3-b03f-cd5aee83749b" (UID: "2651c902-94b1-4da3-b03f-cd5aee83749b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.222271 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2651c902-94b1-4da3-b03f-cd5aee83749b" (UID: "2651c902-94b1-4da3-b03f-cd5aee83749b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.249290 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79" (OuterVolumeSpecName: "glance") pod "2651c902-94b1-4da3-b03f-cd5aee83749b" (UID: "2651c902-94b1-4da3-b03f-cd5aee83749b"). InnerVolumeSpecName "pvc-56daf3c3-162c-4970-aab6-c4cecea22e79". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.263197 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgt6x\" (UniqueName: \"kubernetes.io/projected/2651c902-94b1-4da3-b03f-cd5aee83749b-kube-api-access-jgt6x\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.263227 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.263255 4721 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") on node \"crc\" " Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.263265 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.263275 4721 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2651c902-94b1-4da3-b03f-cd5aee83749b-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.263286 4721 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2651c902-94b1-4da3-b03f-cd5aee83749b-logs\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.280218 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 13:23:58 crc kubenswrapper[4721]: W0202 13:23:58.286978 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cdd3f19_3e66_4807_a0e8_957c713cef36.slice/crio-57b27fa9d3d382916d0fc10e2c5127febea2102b5b13a8f86a12bdcd66a461e9 WatchSource:0}: Error finding container 57b27fa9d3d382916d0fc10e2c5127febea2102b5b13a8f86a12bdcd66a461e9: Status 404 returned error can't find the container with id 57b27fa9d3d382916d0fc10e2c5127febea2102b5b13a8f86a12bdcd66a461e9 Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.301234 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-config-data" (OuterVolumeSpecName: "config-data") pod "2651c902-94b1-4da3-b03f-cd5aee83749b" (UID: "2651c902-94b1-4da3-b03f-cd5aee83749b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.303271 4721 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.303521 4721 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-56daf3c3-162c-4970-aab6-c4cecea22e79" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79") on node "crc" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.366874 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2651c902-94b1-4da3-b03f-cd5aee83749b-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.367288 4721 reconciler_common.go:293] "Volume detached for volume \"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") on node \"crc\" DevicePath \"\"" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.453392 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231734f6-4050-4aff-92a2-a92982428b95" path="/var/lib/kubelet/pods/231734f6-4050-4aff-92a2-a92982428b95/volumes" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.706719 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cdd3f19-3e66-4807-a0e8-957c713cef36","Type":"ContainerStarted","Data":"57b27fa9d3d382916d0fc10e2c5127febea2102b5b13a8f86a12bdcd66a461e9"} Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.713007 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2651c902-94b1-4da3-b03f-cd5aee83749b","Type":"ContainerDied","Data":"de0fa32e0591d0c591d4364744d36cc03bd323f3bbad2d10abfa57cf3278568e"} Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.713091 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.713113 4721 scope.go:117] "RemoveContainer" containerID="6863179c724438e853ded7b834d2693a1bdec25fbe0b4d4e3ec7a6bb5320d8bb" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.748036 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75b75c495b-kpsxz"] Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.943453 4721 scope.go:117] "RemoveContainer" containerID="9b2f14ad59791ba7c4b7509a93d19935150acbbd21de442df3a99c78406b90d5" Feb 02 13:23:58 crc kubenswrapper[4721]: I0202 13:23:58.989437 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.027159 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.038143 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 13:23:59 crc kubenswrapper[4721]: E0202 13:23:59.038814 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2651c902-94b1-4da3-b03f-cd5aee83749b" containerName="glance-log" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.038834 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="2651c902-94b1-4da3-b03f-cd5aee83749b" containerName="glance-log" Feb 02 13:23:59 crc kubenswrapper[4721]: E0202 13:23:59.038899 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2651c902-94b1-4da3-b03f-cd5aee83749b" containerName="glance-httpd" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.038910 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="2651c902-94b1-4da3-b03f-cd5aee83749b" containerName="glance-httpd" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.039198 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="2651c902-94b1-4da3-b03f-cd5aee83749b" containerName="glance-httpd" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.039222 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="2651c902-94b1-4da3-b03f-cd5aee83749b" containerName="glance-log" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.040783 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.043886 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.044707 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.056825 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.092958 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6x5k\" (UniqueName: \"kubernetes.io/projected/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-kube-api-access-r6x5k\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.093084 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.093183 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.093510 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.093563 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.093644 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.093688 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-logs\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.093929 4721 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.195905 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.195956 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.195996 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.196021 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-logs\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.196119 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.196168 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6x5k\" (UniqueName: \"kubernetes.io/projected/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-kube-api-access-r6x5k\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.196200 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.196253 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.197464 4721 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.198216 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-logs\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.202610 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.202690 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f813ebfbde533117d7c5539927c335132efd30cff3e1cb355d78cb9d4c1a927/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.204248 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.205524 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.206419 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.221813 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.233957 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6x5k\" (UniqueName: \"kubernetes.io/projected/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-kube-api-access-r6x5k\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.255605 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") pod \"glance-default-internal-api-0\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.389912 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.741448 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cdd3f19-3e66-4807-a0e8-957c713cef36","Type":"ContainerStarted","Data":"60d5f5c74197287dc48184ed76b8f6d34b4d23a85669a6b5c075668ede91ce31"} Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.749841 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75b75c495b-kpsxz" event={"ID":"873ec78b-5777-4560-a744-c4789b43d966","Type":"ContainerStarted","Data":"be87e0c26b43f57fe8bd6e999c716fddff3bcf7e4eec9b29a39b86ceafba0594"} Feb 02 13:23:59 crc kubenswrapper[4721]: I0202 13:23:59.749894 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75b75c495b-kpsxz" event={"ID":"873ec78b-5777-4560-a744-c4789b43d966","Type":"ContainerStarted","Data":"bef5ef7eca11516e2b9cce9579ac419ba493f62c89b699687290738232336cce"} Feb 02 13:24:00 crc kubenswrapper[4721]: I0202 13:24:00.423524 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2651c902-94b1-4da3-b03f-cd5aee83749b" path="/var/lib/kubelet/pods/2651c902-94b1-4da3-b03f-cd5aee83749b/volumes" Feb 02 13:24:00 crc kubenswrapper[4721]: I0202 13:24:00.760919 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:24:00 crc kubenswrapper[4721]: I0202 13:24:00.830808 4721 generic.go:334] "Generic (PLEG): container finished" podID="d168e414-ab7e-45ad-b142-25dcc1c359b0" containerID="3705d645077158cd12edf8f0f9b5a39f0ba95d5854f57c056a964be7f2bc24c9" exitCode=0 Feb 02 13:24:00 crc kubenswrapper[4721]: I0202 13:24:00.830908 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dw7nl" event={"ID":"d168e414-ab7e-45ad-b142-25dcc1c359b0","Type":"ContainerDied","Data":"3705d645077158cd12edf8f0f9b5a39f0ba95d5854f57c056a964be7f2bc24c9"} Feb 02 13:24:00 crc kubenswrapper[4721]: I0202 13:24:00.905487 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-x4f5m"] Feb 02 13:24:00 crc kubenswrapper[4721]: I0202 13:24:00.905748 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-x4f5m" podUID="a12cebe8-c719-4841-8d01-e9faf9b745cf" containerName="dnsmasq-dns" containerID="cri-o://76fa42f68bb28a61e9cdddef88da78612bb32973f43a836c40f835e5ed6d0856" gracePeriod=10 Feb 02 13:24:01 crc kubenswrapper[4721]: I0202 13:24:01.843772 4721 generic.go:334] "Generic (PLEG): container finished" podID="a12cebe8-c719-4841-8d01-e9faf9b745cf" containerID="76fa42f68bb28a61e9cdddef88da78612bb32973f43a836c40f835e5ed6d0856" exitCode=0 Feb 02 13:24:01 crc kubenswrapper[4721]: I0202 13:24:01.844324 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-x4f5m" event={"ID":"a12cebe8-c719-4841-8d01-e9faf9b745cf","Type":"ContainerDied","Data":"76fa42f68bb28a61e9cdddef88da78612bb32973f43a836c40f835e5ed6d0856"} Feb 02 13:24:01 crc kubenswrapper[4721]: I0202 
13:24:01.894824 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-x4f5m" podUID="a12cebe8-c719-4841-8d01-e9faf9b745cf" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: connect: connection refused" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.444706 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dw7nl" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.485025 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.612109 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzdbz\" (UniqueName: \"kubernetes.io/projected/d168e414-ab7e-45ad-b142-25dcc1c359b0-kube-api-access-bzdbz\") pod \"d168e414-ab7e-45ad-b142-25dcc1c359b0\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.612240 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-ovsdbserver-sb\") pod \"a12cebe8-c719-4841-8d01-e9faf9b745cf\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.612319 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-dns-svc\") pod \"a12cebe8-c719-4841-8d01-e9faf9b745cf\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.612364 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-config\") pod \"a12cebe8-c719-4841-8d01-e9faf9b745cf\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.612383 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2hmg\" (UniqueName: \"kubernetes.io/projected/a12cebe8-c719-4841-8d01-e9faf9b745cf-kube-api-access-h2hmg\") pod \"a12cebe8-c719-4841-8d01-e9faf9b745cf\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.612441 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-credential-keys\") pod \"d168e414-ab7e-45ad-b142-25dcc1c359b0\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.612464 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-fernet-keys\") pod \"d168e414-ab7e-45ad-b142-25dcc1c359b0\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.612511 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-config-data\") pod \"d168e414-ab7e-45ad-b142-25dcc1c359b0\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.612558 4721 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-scripts\") pod \"d168e414-ab7e-45ad-b142-25dcc1c359b0\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.612640 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-combined-ca-bundle\") pod \"d168e414-ab7e-45ad-b142-25dcc1c359b0\" (UID: \"d168e414-ab7e-45ad-b142-25dcc1c359b0\") " Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.612723 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-ovsdbserver-nb\") pod \"a12cebe8-c719-4841-8d01-e9faf9b745cf\" (UID: \"a12cebe8-c719-4841-8d01-e9faf9b745cf\") " Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.638557 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d168e414-ab7e-45ad-b142-25dcc1c359b0-kube-api-access-bzdbz" (OuterVolumeSpecName: "kube-api-access-bzdbz") pod "d168e414-ab7e-45ad-b142-25dcc1c359b0" (UID: "d168e414-ab7e-45ad-b142-25dcc1c359b0"). InnerVolumeSpecName "kube-api-access-bzdbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.651007 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a12cebe8-c719-4841-8d01-e9faf9b745cf-kube-api-access-h2hmg" (OuterVolumeSpecName: "kube-api-access-h2hmg") pod "a12cebe8-c719-4841-8d01-e9faf9b745cf" (UID: "a12cebe8-c719-4841-8d01-e9faf9b745cf"). InnerVolumeSpecName "kube-api-access-h2hmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.674052 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-scripts" (OuterVolumeSpecName: "scripts") pod "d168e414-ab7e-45ad-b142-25dcc1c359b0" (UID: "d168e414-ab7e-45ad-b142-25dcc1c359b0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.675402 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mlbxn"] Feb 02 13:24:03 crc kubenswrapper[4721]: E0202 13:24:03.677030 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12cebe8-c719-4841-8d01-e9faf9b745cf" containerName="dnsmasq-dns" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.677052 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12cebe8-c719-4841-8d01-e9faf9b745cf" containerName="dnsmasq-dns" Feb 02 13:24:03 crc kubenswrapper[4721]: E0202 13:24:03.677081 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d168e414-ab7e-45ad-b142-25dcc1c359b0" containerName="keystone-bootstrap" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.677089 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="d168e414-ab7e-45ad-b142-25dcc1c359b0" containerName="keystone-bootstrap" Feb 02 13:24:03 crc kubenswrapper[4721]: E0202 13:24:03.677138 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12cebe8-c719-4841-8d01-e9faf9b745cf" containerName="init" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.677152 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12cebe8-c719-4841-8d01-e9faf9b745cf" containerName="init" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.677408 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="d168e414-ab7e-45ad-b142-25dcc1c359b0" containerName="keystone-bootstrap" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.677429 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="a12cebe8-c719-4841-8d01-e9faf9b745cf" containerName="dnsmasq-dns" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.678376 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d168e414-ab7e-45ad-b142-25dcc1c359b0" (UID: "d168e414-ab7e-45ad-b142-25dcc1c359b0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.681651 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mlbxn" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.694274 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mlbxn"] Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.717288 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d168e414-ab7e-45ad-b142-25dcc1c359b0" (UID: "d168e414-ab7e-45ad-b142-25dcc1c359b0"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.722900 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzdbz\" (UniqueName: \"kubernetes.io/projected/d168e414-ab7e-45ad-b142-25dcc1c359b0-kube-api-access-bzdbz\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.723054 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2hmg\" (UniqueName: \"kubernetes.io/projected/a12cebe8-c719-4841-8d01-e9faf9b745cf-kube-api-access-h2hmg\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.723252 4721 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.723347 4721 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.723407 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.826508 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmwvt\" (UniqueName: \"kubernetes.io/projected/37372b76-ef54-4a44-9b56-dea754373219-kube-api-access-dmwvt\") pod \"redhat-operators-mlbxn\" (UID: \"37372b76-ef54-4a44-9b56-dea754373219\") " pod="openshift-marketplace/redhat-operators-mlbxn" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.827286 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37372b76-ef54-4a44-9b56-dea754373219-utilities\") pod \"redhat-operators-mlbxn\" (UID: \"37372b76-ef54-4a44-9b56-dea754373219\") " pod="openshift-marketplace/redhat-operators-mlbxn" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.827500 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37372b76-ef54-4a44-9b56-dea754373219-catalog-content\") pod \"redhat-operators-mlbxn\" (UID: \"37372b76-ef54-4a44-9b56-dea754373219\") " pod="openshift-marketplace/redhat-operators-mlbxn" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.891768 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-x4f5m" event={"ID":"a12cebe8-c719-4841-8d01-e9faf9b745cf","Type":"ContainerDied","Data":"521bca06923fe78f1ba71782b798785e3b87c44a41b5a65d319554abd4047afe"} Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.891828 4721 scope.go:117] "RemoveContainer" containerID="76fa42f68bb28a61e9cdddef88da78612bb32973f43a836c40f835e5ed6d0856" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.891975 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-x4f5m" Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.908221 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dw7nl"
Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.909017 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dw7nl" event={"ID":"d168e414-ab7e-45ad-b142-25dcc1c359b0","Type":"ContainerDied","Data":"b5e597a9c6809a0c74e15c50feaa328c23d1128cb144bc36612339923f03dd73"}
Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.909049 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5e597a9c6809a0c74e15c50feaa328c23d1128cb144bc36612339923f03dd73"
Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.926166 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5","Type":"ContainerStarted","Data":"4ae676a120a0891127e9d7af3640c4f6801bac80535f6452a01c2c0a7c50780d"}
Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.931457 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmwvt\" (UniqueName: \"kubernetes.io/projected/37372b76-ef54-4a44-9b56-dea754373219-kube-api-access-dmwvt\") pod \"redhat-operators-mlbxn\" (UID: \"37372b76-ef54-4a44-9b56-dea754373219\") " pod="openshift-marketplace/redhat-operators-mlbxn"
Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.931630 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37372b76-ef54-4a44-9b56-dea754373219-utilities\") pod \"redhat-operators-mlbxn\" (UID: \"37372b76-ef54-4a44-9b56-dea754373219\") " pod="openshift-marketplace/redhat-operators-mlbxn"
Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.931681 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37372b76-ef54-4a44-9b56-dea754373219-catalog-content\") pod \"redhat-operators-mlbxn\" (UID: \"37372b76-ef54-4a44-9b56-dea754373219\") " pod="openshift-marketplace/redhat-operators-mlbxn"
Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.934053 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37372b76-ef54-4a44-9b56-dea754373219-utilities\") pod \"redhat-operators-mlbxn\" (UID: \"37372b76-ef54-4a44-9b56-dea754373219\") " pod="openshift-marketplace/redhat-operators-mlbxn"
Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.936471 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37372b76-ef54-4a44-9b56-dea754373219-catalog-content\") pod \"redhat-operators-mlbxn\" (UID: \"37372b76-ef54-4a44-9b56-dea754373219\") " pod="openshift-marketplace/redhat-operators-mlbxn"
Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.954537 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-config-data" (OuterVolumeSpecName: "config-data") pod "d168e414-ab7e-45ad-b142-25dcc1c359b0" (UID: "d168e414-ab7e-45ad-b142-25dcc1c359b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.967952 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmwvt\" (UniqueName: \"kubernetes.io/projected/37372b76-ef54-4a44-9b56-dea754373219-kube-api-access-dmwvt\") pod \"redhat-operators-mlbxn\" (UID: \"37372b76-ef54-4a44-9b56-dea754373219\") " pod="openshift-marketplace/redhat-operators-mlbxn"
Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.972349 4721 scope.go:117] "RemoveContainer" containerID="b3aeeebb46496223c552a8ed7c33309ec906c0d30db8ba232bc642182832692e"
Feb 02 13:24:03 crc kubenswrapper[4721]: I0202 13:24:03.984545 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a12cebe8-c719-4841-8d01-e9faf9b745cf" (UID: "a12cebe8-c719-4841-8d01-e9faf9b745cf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.022809 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mlbxn"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.024531 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a12cebe8-c719-4841-8d01-e9faf9b745cf" (UID: "a12cebe8-c719-4841-8d01-e9faf9b745cf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.043426 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.044477 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.044509 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.081312 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.148971 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a12cebe8-c719-4841-8d01-e9faf9b745cf" (UID: "a12cebe8-c719-4841-8d01-e9faf9b745cf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.167190 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d168e414-ab7e-45ad-b142-25dcc1c359b0" (UID: "d168e414-ab7e-45ad-b142-25dcc1c359b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.182736 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-config" (OuterVolumeSpecName: "config") pod "a12cebe8-c719-4841-8d01-e9faf9b745cf" (UID: "a12cebe8-c719-4841-8d01-e9faf9b745cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.251362 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d168e414-ab7e-45ad-b142-25dcc1c359b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.251416 4721 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.251425 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12cebe8-c719-4841-8d01-e9faf9b745cf-config\") on node \"crc\" DevicePath \"\""
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.584950 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-x4f5m"]
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.610235 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-x4f5m"]
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.627446 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-784866f846-pjz9x"]
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.629068 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.635112 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.635920 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.636074 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.636260 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7z9s7"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.636333 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.650752 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.651383 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-784866f846-pjz9x"]
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.756018 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mlbxn"]
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.808305 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pm8z\" (UniqueName: \"kubernetes.io/projected/5883cb27-6bc8-4309-aeac-64a54a46eb89-kube-api-access-5pm8z\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.808563 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-scripts\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.808621 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-internal-tls-certs\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.808643 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-public-tls-certs\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.808686 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-combined-ca-bundle\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.808752 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-fernet-keys\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.808780 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-config-data\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.808831 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-credential-keys\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.910826 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-internal-tls-certs\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.910891 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-public-tls-certs\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.910956 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-combined-ca-bundle\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.911046 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-fernet-keys\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.911113 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-config-data\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.911186 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-credential-keys\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.911269 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pm8z\" (UniqueName: \"kubernetes.io/projected/5883cb27-6bc8-4309-aeac-64a54a46eb89-kube-api-access-5pm8z\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.911302 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-scripts\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.929111 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-internal-tls-certs\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.939213 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-scripts\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.966501 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-credential-keys\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.966847 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-public-tls-certs\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.967730 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-fernet-keys\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.967909 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-combined-ca-bundle\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.970920 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5883cb27-6bc8-4309-aeac-64a54a46eb89-config-data\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.971695 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pm8z\" (UniqueName: \"kubernetes.io/projected/5883cb27-6bc8-4309-aeac-64a54a46eb89-kube-api-access-5pm8z\") pod \"keystone-784866f846-pjz9x\" (UID: \"5883cb27-6bc8-4309-aeac-64a54a46eb89\") " pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:04 crc kubenswrapper[4721]: I0202 13:24:04.994222 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:05 crc kubenswrapper[4721]: I0202 13:24:05.153590 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cgqfl" event={"ID":"47a4176b-5f58-47a9-a614-e5d05526da18","Type":"ContainerStarted","Data":"2f2b028f4f0c88964c0238ef71f7a14ee0e0d63a6586667e0a8a76c80b585914"}
Feb 02 13:24:05 crc kubenswrapper[4721]: I0202 13:24:05.177775 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75b75c495b-kpsxz" event={"ID":"873ec78b-5777-4560-a744-c4789b43d966","Type":"ContainerStarted","Data":"9340d886174492f6ae0a15f3f3e7b662045af6f27123113fd5b1ff49e71bab73"}
Feb 02 13:24:05 crc kubenswrapper[4721]: I0202 13:24:05.179359 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75b75c495b-kpsxz"
Feb 02 13:24:05 crc kubenswrapper[4721]: I0202 13:24:05.211309 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlbxn" event={"ID":"37372b76-ef54-4a44-9b56-dea754373219","Type":"ContainerStarted","Data":"527e8468434eea06358e3d4622c114662919b1ba98ef618fb71f16dfc7759e5a"}
Feb 02 13:24:05 crc kubenswrapper[4721]: I0202 13:24:05.227454 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-cgqfl" podStartSLOduration=4.675115627 podStartE2EDuration="46.227435362s" podCreationTimestamp="2026-02-02 13:23:19 +0000 UTC" firstStartedPulling="2026-02-02 13:23:21.764751597 +0000 UTC m=+1342.067265986" lastFinishedPulling="2026-02-02 13:24:03.317071332 +0000 UTC m=+1383.619585721" observedRunningTime="2026-02-02 13:24:05.20265155 +0000 UTC m=+1385.505165949" watchObservedRunningTime="2026-02-02 13:24:05.227435362 +0000 UTC m=+1385.529949741"
Feb 02 13:24:05 crc kubenswrapper[4721]: I0202 13:24:05.265465 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8075cf6d-3ae0-468e-98cb-5f341d78b8ac","Type":"ContainerStarted","Data":"901860dc4f2cb59ed85a54f8bc10b9859a36c07381edb1151f125b84138e4df8"}
Feb 02 13:24:05 crc kubenswrapper[4721]: I0202 13:24:05.275625 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-75b75c495b-kpsxz" podStartSLOduration=8.275607139 podStartE2EDuration="8.275607139s" podCreationTimestamp="2026-02-02 13:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:24:05.261591919 +0000 UTC m=+1385.564106308" watchObservedRunningTime="2026-02-02 13:24:05.275607139 +0000 UTC m=+1385.578121528"
Feb 02 13:24:05 crc kubenswrapper[4721]: I0202 13:24:05.298219 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cdd3f19-3e66-4807-a0e8-957c713cef36","Type":"ContainerStarted","Data":"e352ddf5546d205ff44a3674807cce6288e4d9d2631c1e97a5b81f79535b48d6"}
Feb 02 13:24:05 crc kubenswrapper[4721]: I0202 13:24:05.928939 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-784866f846-pjz9x"]
Feb 02 13:24:05 crc kubenswrapper[4721]: W0202 13:24:05.938550 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5883cb27_6bc8_4309_aeac_64a54a46eb89.slice/crio-3f1661e4f74176df80c70a1f77adafa36c5de27686ba51983a47e447156a65e1 WatchSource:0}: Error finding container 3f1661e4f74176df80c70a1f77adafa36c5de27686ba51983a47e447156a65e1: Status 404 returned error can't find the container with id 3f1661e4f74176df80c70a1f77adafa36c5de27686ba51983a47e447156a65e1
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.092179 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b2blk"]
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.095751 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2blk"
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.114620 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b2blk"]
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.199525 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqx6q\" (UniqueName: \"kubernetes.io/projected/cb085bc7-03fe-45d5-8293-754aa8a47e79-kube-api-access-dqx6q\") pod \"community-operators-b2blk\" (UID: \"cb085bc7-03fe-45d5-8293-754aa8a47e79\") " pod="openshift-marketplace/community-operators-b2blk"
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.199634 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb085bc7-03fe-45d5-8293-754aa8a47e79-catalog-content\") pod \"community-operators-b2blk\" (UID: \"cb085bc7-03fe-45d5-8293-754aa8a47e79\") " pod="openshift-marketplace/community-operators-b2blk"
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.199726 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb085bc7-03fe-45d5-8293-754aa8a47e79-utilities\") pod \"community-operators-b2blk\" (UID: \"cb085bc7-03fe-45d5-8293-754aa8a47e79\") " pod="openshift-marketplace/community-operators-b2blk"
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.305316 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqx6q\" (UniqueName: \"kubernetes.io/projected/cb085bc7-03fe-45d5-8293-754aa8a47e79-kube-api-access-dqx6q\") pod \"community-operators-b2blk\" (UID: \"cb085bc7-03fe-45d5-8293-754aa8a47e79\") " pod="openshift-marketplace/community-operators-b2blk"
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.305765 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb085bc7-03fe-45d5-8293-754aa8a47e79-catalog-content\") pod \"community-operators-b2blk\" (UID: \"cb085bc7-03fe-45d5-8293-754aa8a47e79\") " pod="openshift-marketplace/community-operators-b2blk"
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.306543 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb085bc7-03fe-45d5-8293-754aa8a47e79-catalog-content\") pod \"community-operators-b2blk\" (UID: \"cb085bc7-03fe-45d5-8293-754aa8a47e79\") " pod="openshift-marketplace/community-operators-b2blk"
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.307222 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb085bc7-03fe-45d5-8293-754aa8a47e79-utilities\") pod \"community-operators-b2blk\" (UID: \"cb085bc7-03fe-45d5-8293-754aa8a47e79\") " pod="openshift-marketplace/community-operators-b2blk"
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.307634 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb085bc7-03fe-45d5-8293-754aa8a47e79-utilities\") pod \"community-operators-b2blk\" (UID: \"cb085bc7-03fe-45d5-8293-754aa8a47e79\") " pod="openshift-marketplace/community-operators-b2blk"
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.325399 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqx6q\" (UniqueName: \"kubernetes.io/projected/cb085bc7-03fe-45d5-8293-754aa8a47e79-kube-api-access-dqx6q\") pod \"community-operators-b2blk\" (UID: \"cb085bc7-03fe-45d5-8293-754aa8a47e79\") " pod="openshift-marketplace/community-operators-b2blk"
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.329185 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-784866f846-pjz9x" event={"ID":"5883cb27-6bc8-4309-aeac-64a54a46eb89","Type":"ContainerStarted","Data":"3f1661e4f74176df80c70a1f77adafa36c5de27686ba51983a47e447156a65e1"}
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.330035 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75b75c495b-kpsxz"
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.423136 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a12cebe8-c719-4841-8d01-e9faf9b745cf" path="/var/lib/kubelet/pods/a12cebe8-c719-4841-8d01-e9faf9b745cf/volumes"
Feb 02 13:24:06 crc kubenswrapper[4721]: I0202 13:24:06.600187 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2blk"
Feb 02 13:24:07 crc kubenswrapper[4721]: W0202 13:24:07.261032 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb085bc7_03fe_45d5_8293_754aa8a47e79.slice/crio-ca306f9ea887ebe52384a1958a49ecea30993901251ae5049666a5cc6680fe83 WatchSource:0}: Error finding container ca306f9ea887ebe52384a1958a49ecea30993901251ae5049666a5cc6680fe83: Status 404 returned error can't find the container with id ca306f9ea887ebe52384a1958a49ecea30993901251ae5049666a5cc6680fe83
Feb 02 13:24:07 crc kubenswrapper[4721]: I0202 13:24:07.270135 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b2blk"]
Feb 02 13:24:07 crc kubenswrapper[4721]: I0202 13:24:07.348342 4721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 02 13:24:07 crc kubenswrapper[4721]: I0202 13:24:07.348282 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2blk" event={"ID":"cb085bc7-03fe-45d5-8293-754aa8a47e79","Type":"ContainerStarted","Data":"ca306f9ea887ebe52384a1958a49ecea30993901251ae5049666a5cc6680fe83"}
Feb 02 13:24:07 crc kubenswrapper[4721]: I0202 13:24:07.388887 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.388864553 podStartE2EDuration="11.388864553s" podCreationTimestamp="2026-02-02 13:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:24:07.372248492 +0000 UTC m=+1387.674762881" watchObservedRunningTime="2026-02-02 13:24:07.388864553 +0000 UTC m=+1387.691378942"
Feb 02 13:24:08 crc kubenswrapper[4721]: I0202 13:24:08.366289 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8075cf6d-3ae0-468e-98cb-5f341d78b8ac","Type":"ContainerStarted","Data":"6b2f4ace0eb75cc9899716f3900092d7c461a3838ae11f1c1f66ca5572586d0f"}
Feb 02 13:24:08 crc kubenswrapper[4721]: I0202 13:24:08.372869 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-784866f846-pjz9x" event={"ID":"5883cb27-6bc8-4309-aeac-64a54a46eb89","Type":"ContainerStarted","Data":"9286b365d0d48470dda4022122eaf45ff6454371eba06e55cacc127143ef2b2e"}
Feb 02 13:24:08 crc kubenswrapper[4721]: I0202 13:24:08.373228 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-784866f846-pjz9x"
Feb 02 13:24:08 crc kubenswrapper[4721]: I0202 13:24:08.384646 4721 generic.go:334] "Generic (PLEG): container finished" podID="37372b76-ef54-4a44-9b56-dea754373219" containerID="be431440dec21df753f3a84f641990a0bc2d2448dae27123ad4db374fc8f8981" exitCode=0
Feb 02 13:24:08 crc kubenswrapper[4721]: I0202 13:24:08.384939 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlbxn" event={"ID":"37372b76-ef54-4a44-9b56-dea754373219","Type":"ContainerDied","Data":"be431440dec21df753f3a84f641990a0bc2d2448dae27123ad4db374fc8f8981"}
Feb 02 13:24:08 crc kubenswrapper[4721]: I0202 13:24:08.396821 4721 generic.go:334] "Generic (PLEG): container finished" podID="cb085bc7-03fe-45d5-8293-754aa8a47e79" containerID="5e7d094ebf15b03eae8cab7968f99b33fb74f6a236c1143143fd5cef22da4396" exitCode=0
Feb 02 13:24:08 crc kubenswrapper[4721]: I0202 13:24:08.396876 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2blk" event={"ID":"cb085bc7-03fe-45d5-8293-754aa8a47e79","Type":"ContainerDied","Data":"5e7d094ebf15b03eae8cab7968f99b33fb74f6a236c1143143fd5cef22da4396"}
Feb 02 13:24:08 crc kubenswrapper[4721]: I0202 13:24:08.480278 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-784866f846-pjz9x" podStartSLOduration=4.480249412 podStartE2EDuration="4.480249412s" podCreationTimestamp="2026-02-02 13:24:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:24:08.403969223 +0000 UTC m=+1388.706483642" watchObservedRunningTime="2026-02-02 13:24:08.480249412 +0000 UTC m=+1388.782763821"
Feb 02 13:24:09 crc kubenswrapper[4721]: I0202 13:24:09.434393 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8075cf6d-3ae0-468e-98cb-5f341d78b8ac","Type":"ContainerStarted","Data":"8d37382e602df793413e2180a90609111293c970770dc8adb79b0513cd945df9"}
Feb 02 13:24:09 crc kubenswrapper[4721]: I0202 13:24:09.444142 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-n52pp" event={"ID":"9fa244a8-7588-4d87-bd5b-cbcd10780c83","Type":"ContainerStarted","Data":"0b784cae8dddc21c2c3af89d409032d5d888340b53c786c1dc27d600d257dd2b"}
Feb 02 13:24:09 crc kubenswrapper[4721]: I0202 13:24:09.466119 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=11.466092819 podStartE2EDuration="11.466092819s" podCreationTimestamp="2026-02-02 13:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:24:09.463378375 +0000 UTC m=+1389.765892794" watchObservedRunningTime="2026-02-02 13:24:09.466092819 +0000 UTC m=+1389.768607218"
Feb 02 13:24:09 crc kubenswrapper[4721]: I0202 13:24:09.497762 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-n52pp" podStartSLOduration=5.066205161 podStartE2EDuration="51.497739517s" podCreationTimestamp="2026-02-02 13:23:18 +0000 UTC" firstStartedPulling="2026-02-02 13:23:21.669051589 +0000 UTC m=+1341.971565978" lastFinishedPulling="2026-02-02 13:24:08.100585945 +0000 UTC m=+1388.403100334" observedRunningTime="2026-02-02 13:24:09.491666663 +0000 UTC m=+1389.794181052" watchObservedRunningTime="2026-02-02 13:24:09.497739517 +0000 UTC m=+1389.800253906"
Feb 02 13:24:09 crc kubenswrapper[4721]: I0202 13:24:09.976514 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75b75c495b-kpsxz"
Feb 02 13:24:11 crc kubenswrapper[4721]: I0202 13:24:11.482334 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7wjxh" event={"ID":"ad3578ef-5d1b-4c52-939c-237feadc1c5c","Type":"ContainerStarted","Data":"ce2db44950c758448aaea5320ccdad1fe422fd10d5dc9377dff5887076136a7a"}
Feb 02 13:24:11 crc kubenswrapper[4721]: I0202 13:24:11.506665 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-7wjxh" podStartSLOduration=6.43815022 podStartE2EDuration="53.506646931s" podCreationTimestamp="2026-02-02 13:23:18 +0000 UTC" firstStartedPulling="2026-02-02 13:23:21.437748296 +0000 UTC m=+1341.740262695" lastFinishedPulling="2026-02-02 13:24:08.506245017 +0000 UTC m=+1388.808759406" observedRunningTime="2026-02-02 13:24:11.505797927 +0000 UTC m=+1391.808312316" watchObservedRunningTime="2026-02-02 13:24:11.506646931 +0000 UTC m=+1391.809161320"
Feb 02 13:24:11 crc kubenswrapper[4721]: I0202 13:24:11.978702 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75b75c495b-kpsxz"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.263400 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6ccdcdf5fb-gncnr"]
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.265643 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.282955 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6ccdcdf5fb-gncnr"]
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.405127 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e3f4574-6ad6-4b37-abf5-2005c8692a44-public-tls-certs\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.405202 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e3f4574-6ad6-4b37-abf5-2005c8692a44-scripts\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.405277 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e3f4574-6ad6-4b37-abf5-2005c8692a44-internal-tls-certs\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.405359 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e3f4574-6ad6-4b37-abf5-2005c8692a44-config-data\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.405460 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3f4574-6ad6-4b37-abf5-2005c8692a44-combined-ca-bundle\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.405610 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqwdb\" (UniqueName: \"kubernetes.io/projected/8e3f4574-6ad6-4b37-abf5-2005c8692a44-kube-api-access-jqwdb\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.405737 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e3f4574-6ad6-4b37-abf5-2005c8692a44-logs\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.495502 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlbxn" event={"ID":"37372b76-ef54-4a44-9b56-dea754373219","Type":"ContainerStarted","Data":"e7fd605be804cd647b2eac1ab49ff2a82cd119fbc55b1ed46d82b6f010b0c40a"}
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.498214 4721 generic.go:334] "Generic (PLEG): container finished" podID="cb085bc7-03fe-45d5-8293-754aa8a47e79" containerID="1657f0f86f0baa3fd5b4910f2d2a3f7f01376b4573cd7a450f9ac3d3c0377f0f" exitCode=0
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.498269 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2blk" event={"ID":"cb085bc7-03fe-45d5-8293-754aa8a47e79","Type":"ContainerDied","Data":"1657f0f86f0baa3fd5b4910f2d2a3f7f01376b4573cd7a450f9ac3d3c0377f0f"}
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.507510 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e3f4574-6ad6-4b37-abf5-2005c8692a44-logs\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.507590 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e3f4574-6ad6-4b37-abf5-2005c8692a44-public-tls-certs\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.507627 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e3f4574-6ad6-4b37-abf5-2005c8692a44-scripts\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.507703 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e3f4574-6ad6-4b37-abf5-2005c8692a44-internal-tls-certs\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.507748 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e3f4574-6ad6-4b37-abf5-2005c8692a44-config-data\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.507844 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3f4574-6ad6-4b37-abf5-2005c8692a44-combined-ca-bundle\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.507930 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqwdb\" (UniqueName: \"kubernetes.io/projected/8e3f4574-6ad6-4b37-abf5-2005c8692a44-kube-api-access-jqwdb\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.508139 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e3f4574-6ad6-4b37-abf5-2005c8692a44-logs\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.513857 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e3f4574-6ad6-4b37-abf5-2005c8692a44-public-tls-certs\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.604900 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e3f4574-6ad6-4b37-abf5-2005c8692a44-scripts\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.610691 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e3f4574-6ad6-4b37-abf5-2005c8692a44-internal-tls-certs\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.613825 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e3f4574-6ad6-4b37-abf5-2005c8692a44-config-data\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.615752 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3f4574-6ad6-4b37-abf5-2005c8692a44-combined-ca-bundle\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.616614 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqwdb\" (UniqueName: \"kubernetes.io/projected/8e3f4574-6ad6-4b37-abf5-2005c8692a44-kube-api-access-jqwdb\") pod \"placement-6ccdcdf5fb-gncnr\" (UID: \"8e3f4574-6ad6-4b37-abf5-2005c8692a44\") " pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:12 crc kubenswrapper[4721]: I0202 13:24:12.884103 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:15 crc kubenswrapper[4721]: I0202 13:24:15.591900 4721 generic.go:334] "Generic (PLEG): container finished" podID="37372b76-ef54-4a44-9b56-dea754373219" containerID="e7fd605be804cd647b2eac1ab49ff2a82cd119fbc55b1ed46d82b6f010b0c40a" exitCode=0
Feb 02 13:24:15 crc kubenswrapper[4721]: I0202 13:24:15.592009 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlbxn" event={"ID":"37372b76-ef54-4a44-9b56-dea754373219","Type":"ContainerDied","Data":"e7fd605be804cd647b2eac1ab49ff2a82cd119fbc55b1ed46d82b6f010b0c40a"}
Feb 02 13:24:16 crc kubenswrapper[4721]: I0202 13:24:16.606010 4721 generic.go:334] "Generic (PLEG): container finished" podID="47a4176b-5f58-47a9-a614-e5d05526da18" containerID="2f2b028f4f0c88964c0238ef71f7a14ee0e0d63a6586667e0a8a76c80b585914" exitCode=0
Feb 02 13:24:16 crc kubenswrapper[4721]: I0202 13:24:16.606098 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cgqfl" event={"ID":"47a4176b-5f58-47a9-a614-e5d05526da18","Type":"ContainerDied","Data":"2f2b028f4f0c88964c0238ef71f7a14ee0e0d63a6586667e0a8a76c80b585914"}
Feb 02 13:24:17 crc kubenswrapper[4721]: I0202 13:24:17.298951 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 02 13:24:17 crc kubenswrapper[4721]: I0202 13:24:17.299378 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 02 13:24:17 crc kubenswrapper[4721]: I0202 13:24:17.344436 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 02 13:24:17 crc kubenswrapper[4721]: I0202 13:24:17.364767 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 02 13:24:17 crc kubenswrapper[4721]: I0202 13:24:17.620489 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 02 13:24:17 crc kubenswrapper[4721]: I0202 13:24:17.620530 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.442948 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cgqfl"
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.483485 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47a4176b-5f58-47a9-a614-e5d05526da18-db-sync-config-data\") pod \"47a4176b-5f58-47a9-a614-e5d05526da18\" (UID: \"47a4176b-5f58-47a9-a614-e5d05526da18\") "
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.483549 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-777ht\" (UniqueName: \"kubernetes.io/projected/47a4176b-5f58-47a9-a614-e5d05526da18-kube-api-access-777ht\") pod \"47a4176b-5f58-47a9-a614-e5d05526da18\" (UID: \"47a4176b-5f58-47a9-a614-e5d05526da18\") "
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.483643 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a4176b-5f58-47a9-a614-e5d05526da18-combined-ca-bundle\") pod \"47a4176b-5f58-47a9-a614-e5d05526da18\" (UID: \"47a4176b-5f58-47a9-a614-e5d05526da18\") "
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.493964 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a4176b-5f58-47a9-a614-e5d05526da18-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "47a4176b-5f58-47a9-a614-e5d05526da18" (UID: "47a4176b-5f58-47a9-a614-e5d05526da18"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.501619 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47a4176b-5f58-47a9-a614-e5d05526da18-kube-api-access-777ht" (OuterVolumeSpecName: "kube-api-access-777ht") pod "47a4176b-5f58-47a9-a614-e5d05526da18" (UID: "47a4176b-5f58-47a9-a614-e5d05526da18"). InnerVolumeSpecName "kube-api-access-777ht". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.588872 4721 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/47a4176b-5f58-47a9-a614-e5d05526da18-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.588910 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-777ht\" (UniqueName: \"kubernetes.io/projected/47a4176b-5f58-47a9-a614-e5d05526da18-kube-api-access-777ht\") on node \"crc\" DevicePath \"\""
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.616410 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47a4176b-5f58-47a9-a614-e5d05526da18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47a4176b-5f58-47a9-a614-e5d05526da18" (UID: "47a4176b-5f58-47a9-a614-e5d05526da18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.674916 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cgqfl"
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.675797 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cgqfl" event={"ID":"47a4176b-5f58-47a9-a614-e5d05526da18","Type":"ContainerDied","Data":"0e87512b76ceddbcd36259192a7d06b2c50b8fc83a99c2a6ecdeee04f9de5d79"}
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.675849 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e87512b76ceddbcd36259192a7d06b2c50b8fc83a99c2a6ecdeee04f9de5d79"
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.696204 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47a4176b-5f58-47a9-a614-e5d05526da18-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 13:24:18 crc kubenswrapper[4721]: E0202 13:24:18.780002 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5"
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.887838 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6ccdcdf5fb-gncnr"]
Feb 02 13:24:18 crc kubenswrapper[4721]: W0202 13:24:18.895638 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e3f4574_6ad6_4b37_abf5_2005c8692a44.slice/crio-bcb6dddc37c6d2d5129e13f487cd4b9ed27217cc44ad8a2254b576030ce32e69 WatchSource:0}: Error finding container bcb6dddc37c6d2d5129e13f487cd4b9ed27217cc44ad8a2254b576030ce32e69: Status 404 returned error can't find the container with id bcb6dddc37c6d2d5129e13f487cd4b9ed27217cc44ad8a2254b576030ce32e69
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.965143 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-d4595f9f9-4d2g5"]
Feb 02 13:24:18 crc kubenswrapper[4721]: E0202 13:24:18.981351 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47a4176b-5f58-47a9-a614-e5d05526da18" containerName="barbican-db-sync"
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.981387 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="47a4176b-5f58-47a9-a614-e5d05526da18" containerName="barbican-db-sync"
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.981604 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="47a4176b-5f58-47a9-a614-e5d05526da18" containerName="barbican-db-sync"
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.982799 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.986556 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.986571 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 02 13:24:18 crc kubenswrapper[4721]: I0202 13:24:18.987724 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-27rl5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.004169 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-d4595f9f9-4d2g5"]
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.006143 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a7211b-9a15-4765-99e2-520bd1d62ff1-combined-ca-bundle\") pod \"barbican-worker-d4595f9f9-4d2g5\" (UID: \"93a7211b-9a15-4765-99e2-520bd1d62ff1\") " pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.006233 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a7211b-9a15-4765-99e2-520bd1d62ff1-config-data\") pod \"barbican-worker-d4595f9f9-4d2g5\" (UID: \"93a7211b-9a15-4765-99e2-520bd1d62ff1\") " pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.006283 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93a7211b-9a15-4765-99e2-520bd1d62ff1-config-data-custom\") pod \"barbican-worker-d4595f9f9-4d2g5\" (UID: \"93a7211b-9a15-4765-99e2-520bd1d62ff1\") " pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.006427 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93a7211b-9a15-4765-99e2-520bd1d62ff1-logs\") pod \"barbican-worker-d4595f9f9-4d2g5\" (UID: \"93a7211b-9a15-4765-99e2-520bd1d62ff1\") " pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.006596 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jff5n\" (UniqueName: \"kubernetes.io/projected/93a7211b-9a15-4765-99e2-520bd1d62ff1-kube-api-access-jff5n\") pod \"barbican-worker-d4595f9f9-4d2g5\" (UID: \"93a7211b-9a15-4765-99e2-520bd1d62ff1\") " pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.021915 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6f4497866b-px6fz"]
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.023882 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.036031 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.056488 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6f4497866b-px6fz"]
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.108763 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a7211b-9a15-4765-99e2-520bd1d62ff1-combined-ca-bundle\") pod \"barbican-worker-d4595f9f9-4d2g5\" (UID: \"93a7211b-9a15-4765-99e2-520bd1d62ff1\") " pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.108851 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a7211b-9a15-4765-99e2-520bd1d62ff1-config-data\") pod \"barbican-worker-d4595f9f9-4d2g5\" (UID: \"93a7211b-9a15-4765-99e2-520bd1d62ff1\") " pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.108896 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93a7211b-9a15-4765-99e2-520bd1d62ff1-config-data-custom\") pod \"barbican-worker-d4595f9f9-4d2g5\" (UID: \"93a7211b-9a15-4765-99e2-520bd1d62ff1\") " pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.108997 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93a7211b-9a15-4765-99e2-520bd1d62ff1-logs\") pod \"barbican-worker-d4595f9f9-4d2g5\" (UID: \"93a7211b-9a15-4765-99e2-520bd1d62ff1\") " pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.109063 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/755b5957-fcfa-486a-8e63-d562742d6650-config-data-custom\") pod \"barbican-keystone-listener-6f4497866b-px6fz\" (UID: \"755b5957-fcfa-486a-8e63-d562742d6650\") " pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.109119 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755b5957-fcfa-486a-8e63-d562742d6650-combined-ca-bundle\") pod \"barbican-keystone-listener-6f4497866b-px6fz\" (UID: \"755b5957-fcfa-486a-8e63-d562742d6650\") " pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.109151 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755b5957-fcfa-486a-8e63-d562742d6650-config-data\") pod \"barbican-keystone-listener-6f4497866b-px6fz\" (UID: \"755b5957-fcfa-486a-8e63-d562742d6650\") " pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.109180 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjcp4\" (UniqueName: \"kubernetes.io/projected/755b5957-fcfa-486a-8e63-d562742d6650-kube-api-access-sjcp4\") pod \"barbican-keystone-listener-6f4497866b-px6fz\" (UID: \"755b5957-fcfa-486a-8e63-d562742d6650\") " pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.109236 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/755b5957-fcfa-486a-8e63-d562742d6650-logs\") pod \"barbican-keystone-listener-6f4497866b-px6fz\" (UID: \"755b5957-fcfa-486a-8e63-d562742d6650\") " pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.109308 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jff5n\" (UniqueName: \"kubernetes.io/projected/93a7211b-9a15-4765-99e2-520bd1d62ff1-kube-api-access-jff5n\") pod \"barbican-worker-d4595f9f9-4d2g5\" (UID: \"93a7211b-9a15-4765-99e2-520bd1d62ff1\") " pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.115942 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93a7211b-9a15-4765-99e2-520bd1d62ff1-combined-ca-bundle\") pod \"barbican-worker-d4595f9f9-4d2g5\" (UID: \"93a7211b-9a15-4765-99e2-520bd1d62ff1\") " pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.116541 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93a7211b-9a15-4765-99e2-520bd1d62ff1-logs\") pod \"barbican-worker-d4595f9f9-4d2g5\" (UID: \"93a7211b-9a15-4765-99e2-520bd1d62ff1\") " pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.119353 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93a7211b-9a15-4765-99e2-520bd1d62ff1-config-data\") pod \"barbican-worker-d4595f9f9-4d2g5\" (UID: \"93a7211b-9a15-4765-99e2-520bd1d62ff1\") " pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.121841 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93a7211b-9a15-4765-99e2-520bd1d62ff1-config-data-custom\") pod \"barbican-worker-d4595f9f9-4d2g5\" (UID: \"93a7211b-9a15-4765-99e2-520bd1d62ff1\") " pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.147800 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jff5n\" (UniqueName: \"kubernetes.io/projected/93a7211b-9a15-4765-99e2-520bd1d62ff1-kube-api-access-jff5n\") pod \"barbican-worker-d4595f9f9-4d2g5\" (UID: \"93a7211b-9a15-4765-99e2-520bd1d62ff1\") " pod="openstack/barbican-worker-d4595f9f9-4d2g5"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.186376 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xjf7x"]
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.188881 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.213386 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p7rn\" (UniqueName: \"kubernetes.io/projected/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-kube-api-access-5p7rn\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.213822 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.213887 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.217548 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/755b5957-fcfa-486a-8e63-d562742d6650-config-data-custom\") pod \"barbican-keystone-listener-6f4497866b-px6fz\" (UID: \"755b5957-fcfa-486a-8e63-d562742d6650\") " pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.217607 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755b5957-fcfa-486a-8e63-d562742d6650-combined-ca-bundle\") pod \"barbican-keystone-listener-6f4497866b-px6fz\" (UID: \"755b5957-fcfa-486a-8e63-d562742d6650\") " pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.217642 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-config\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.217673 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755b5957-fcfa-486a-8e63-d562742d6650-config-data\") pod \"barbican-keystone-listener-6f4497866b-px6fz\" (UID: \"755b5957-fcfa-486a-8e63-d562742d6650\") " pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.217705 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjcp4\" (UniqueName: \"kubernetes.io/projected/755b5957-fcfa-486a-8e63-d562742d6650-kube-api-access-sjcp4\") pod \"barbican-keystone-listener-6f4497866b-px6fz\" (UID: \"755b5957-fcfa-486a-8e63-d562742d6650\") " pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.217915 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/755b5957-fcfa-486a-8e63-d562742d6650-logs\") pod \"barbican-keystone-listener-6f4497866b-px6fz\" (UID: \"755b5957-fcfa-486a-8e63-d562742d6650\") " pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.217982 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-dns-svc\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.218031 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.221631 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/755b5957-fcfa-486a-8e63-d562742d6650-logs\") pod \"barbican-keystone-listener-6f4497866b-px6fz\" (UID: \"755b5957-fcfa-486a-8e63-d562742d6650\") " pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.229862 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/755b5957-fcfa-486a-8e63-d562742d6650-config-data-custom\") pod \"barbican-keystone-listener-6f4497866b-px6fz\" (UID: \"755b5957-fcfa-486a-8e63-d562742d6650\") " pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.239859 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/755b5957-fcfa-486a-8e63-d562742d6650-config-data\") pod \"barbican-keystone-listener-6f4497866b-px6fz\" (UID: \"755b5957-fcfa-486a-8e63-d562742d6650\") " pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.256047 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/755b5957-fcfa-486a-8e63-d562742d6650-combined-ca-bundle\") pod \"barbican-keystone-listener-6f4497866b-px6fz\" (UID: \"755b5957-fcfa-486a-8e63-d562742d6650\") " pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.269232 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjcp4\" (UniqueName: \"kubernetes.io/projected/755b5957-fcfa-486a-8e63-d562742d6650-kube-api-access-sjcp4\") pod \"barbican-keystone-listener-6f4497866b-px6fz\" (UID: \"755b5957-fcfa-486a-8e63-d562742d6650\") " pod="openstack/barbican-keystone-listener-6f4497866b-px6fz"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.276170 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xjf7x"]
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.320592 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.320654 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.320711 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-config\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.320832 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-dns-svc\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.320864 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.320900 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p7rn\" (UniqueName: \"kubernetes.io/projected/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-kube-api-access-5p7rn\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.322315 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.329178 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-dns-svc\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.329778 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.329841 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-config\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") "
pod="openstack/dnsmasq-dns-85ff748b95-xjf7x" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.333187 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.339441 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-d4595f9f9-4d2g5" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.365691 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p7rn\" (UniqueName: \"kubernetes.io/projected/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-kube-api-access-5p7rn\") pod \"dnsmasq-dns-85ff748b95-xjf7x\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " pod="openstack/dnsmasq-dns-85ff748b95-xjf7x" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.374142 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55dd659f54-28qsl"] Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.380491 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6f4497866b-px6fz" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.392314 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55dd659f54-28qsl"] Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.392359 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.392382 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.393540 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.400389 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.423164 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-config-data\") pod \"barbican-api-55dd659f54-28qsl\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.423246 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7tv9\" (UniqueName: \"kubernetes.io/projected/0f119900-0b52-425a-be0a-0940a4747f89-kube-api-access-k7tv9\") pod \"barbican-api-55dd659f54-28qsl\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.423360 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-config-data-custom\") pod \"barbican-api-55dd659f54-28qsl\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.423550 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-combined-ca-bundle\") pod \"barbican-api-55dd659f54-28qsl\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.423653 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f119900-0b52-425a-be0a-0940a4747f89-logs\") pod \"barbican-api-55dd659f54-28qsl\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.474518 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.530227 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.532817 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-config-data\") pod \"barbican-api-55dd659f54-28qsl\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.532920 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7tv9\" (UniqueName: \"kubernetes.io/projected/0f119900-0b52-425a-be0a-0940a4747f89-kube-api-access-k7tv9\") pod \"barbican-api-55dd659f54-28qsl\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.533021 4721 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-config-data-custom\") pod \"barbican-api-55dd659f54-28qsl\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.533239 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-combined-ca-bundle\") pod \"barbican-api-55dd659f54-28qsl\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.571788 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f119900-0b52-425a-be0a-0940a4747f89-logs\") pod \"barbican-api-55dd659f54-28qsl\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.587656 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f119900-0b52-425a-be0a-0940a4747f89-logs\") pod \"barbican-api-55dd659f54-28qsl\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.651920 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-config-data\") pod \"barbican-api-55dd659f54-28qsl\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.652148 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-config-data-custom\") pod \"barbican-api-55dd659f54-28qsl\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.654352 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7tv9\" (UniqueName: \"kubernetes.io/projected/0f119900-0b52-425a-be0a-0940a4747f89-kube-api-access-k7tv9\") pod \"barbican-api-55dd659f54-28qsl\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.680127 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-combined-ca-bundle\") pod \"barbican-api-55dd659f54-28qsl\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.691140 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-xjf7x" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.731558 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5","Type":"ContainerStarted","Data":"3192e734a06fcd9f8d184e01215add4fd33d5d9771b0e9c27a5f5edaae278699"} Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.732290 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" containerName="ceilometer-notification-agent" containerID="cri-o://532a8ed94d348484657193a2554795492b450f7f802dc97b0881265e4ef935ae" gracePeriod=30 Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.732614 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.733013 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" containerName="proxy-httpd" containerID="cri-o://3192e734a06fcd9f8d184e01215add4fd33d5d9771b0e9c27a5f5edaae278699" gracePeriod=30 Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.733096 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" containerName="sg-core" containerID="cri-o://4ae676a120a0891127e9d7af3640c4f6801bac80535f6452a01c2c0a7c50780d" gracePeriod=30 Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.742114 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.790564 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlbxn" event={"ID":"37372b76-ef54-4a44-9b56-dea754373219","Type":"ContainerStarted","Data":"c5c7de4e95c2706b26ef16b1a2612a77aaede7e751f9c5dde67c52ea9832ec9a"} Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.794543 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6ccdcdf5fb-gncnr" event={"ID":"8e3f4574-6ad6-4b37-abf5-2005c8692a44","Type":"ContainerStarted","Data":"bcb6dddc37c6d2d5129e13f487cd4b9ed27217cc44ad8a2254b576030ce32e69"} Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.829320 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2blk" event={"ID":"cb085bc7-03fe-45d5-8293-754aa8a47e79","Type":"ContainerStarted","Data":"fe87ede3ed73966fcb2f3ff06b67149c74278e5cdd6e6a2388c62a80901815de"} Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.829373 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.829387 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.902769 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mlbxn" podStartSLOduration=6.902242617 podStartE2EDuration="16.90274051s" podCreationTimestamp="2026-02-02 13:24:03 +0000 UTC" firstStartedPulling="2026-02-02 13:24:08.388863554 +0000 UTC m=+1388.691377943" lastFinishedPulling="2026-02-02 13:24:18.389361447 +0000 UTC m=+1398.691875836" 
Feb 02 13:24:19 crc kubenswrapper[4721]: I0202 13:24:19.920958 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b2blk" podStartSLOduration=3.906210125 podStartE2EDuration="13.920930724s" podCreationTimestamp="2026-02-02 13:24:06 +0000 UTC" firstStartedPulling="2026-02-02 13:24:08.409608406 +0000 UTC m=+1388.712122805" lastFinishedPulling="2026-02-02 13:24:18.424329015 +0000 UTC m=+1398.726843404" observedRunningTime="2026-02-02 13:24:19.864560865 +0000 UTC m=+1400.167075254" watchObservedRunningTime="2026-02-02 13:24:19.920930724 +0000 UTC m=+1400.223445113"
Feb 02 13:24:20 crc kubenswrapper[4721]: I0202 13:24:20.255294 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-d4595f9f9-4d2g5"]
Feb 02 13:24:20 crc kubenswrapper[4721]: I0202 13:24:20.697938 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xjf7x"]
Feb 02 13:24:20 crc kubenswrapper[4721]: I0202 13:24:20.710817 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6f4497866b-px6fz"]
Feb 02 13:24:20 crc kubenswrapper[4721]: I0202 13:24:20.820417 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55dd659f54-28qsl"]
Feb 02 13:24:20 crc kubenswrapper[4721]: I0202 13:24:20.849478 4721 generic.go:334] "Generic (PLEG): container finished" podID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" containerID="3192e734a06fcd9f8d184e01215add4fd33d5d9771b0e9c27a5f5edaae278699" exitCode=0
Feb 02 13:24:20 crc kubenswrapper[4721]: I0202 13:24:20.849865 4721 generic.go:334] "Generic (PLEG): container finished" podID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" containerID="4ae676a120a0891127e9d7af3640c4f6801bac80535f6452a01c2c0a7c50780d" exitCode=2
Feb 02 13:24:20 crc kubenswrapper[4721]: I0202 13:24:20.850001 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5","Type":"ContainerDied","Data":"3192e734a06fcd9f8d184e01215add4fd33d5d9771b0e9c27a5f5edaae278699"}
Feb 02 13:24:20 crc kubenswrapper[4721]: I0202 13:24:20.850123 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5","Type":"ContainerDied","Data":"4ae676a120a0891127e9d7af3640c4f6801bac80535f6452a01c2c0a7c50780d"}
Feb 02 13:24:20 crc kubenswrapper[4721]: I0202 13:24:20.852460 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-xjf7x" event={"ID":"d2fa85eb-972d-4369-8103-dd4cd3e2b78a","Type":"ContainerStarted","Data":"99da6e129c53d9990311d147953c7a4dfacacce031a5f1e7ffc090745f296f41"}
Feb 02 13:24:20 crc kubenswrapper[4721]: I0202 13:24:20.869683 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6ccdcdf5fb-gncnr" event={"ID":"8e3f4574-6ad6-4b37-abf5-2005c8692a44","Type":"ContainerStarted","Data":"d560cca08f2f347ec63e253e602295448f38febe23beccfbe7f50b89b4a78d01"}
Feb 02 13:24:20 crc kubenswrapper[4721]: I0202 13:24:20.871369 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55dd659f54-28qsl" event={"ID":"0f119900-0b52-425a-be0a-0940a4747f89","Type":"ContainerStarted","Data":"5366c1b45533360e87afcbc53d2e0646f4fb35a11a7eba0ff655b53f1aa8c34c"}
Feb 02 13:24:20 crc kubenswrapper[4721]: I0202 13:24:20.872419 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f4497866b-px6fz" event={"ID":"755b5957-fcfa-486a-8e63-d562742d6650","Type":"ContainerStarted","Data":"0eceeef39bf1bbda63753bcc4682a336c5f016ce2fd121b0c3b2a127658126dd"}
Feb 02 13:24:20 crc kubenswrapper[4721]: I0202 13:24:20.880617 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7556fd87fb-z78lc"
Feb 02 13:24:20 crc kubenswrapper[4721]: I0202 13:24:20.882755 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d4595f9f9-4d2g5" event={"ID":"93a7211b-9a15-4765-99e2-520bd1d62ff1","Type":"ContainerStarted","Data":"19578437d5b967a5d3ffcff4a282b024b837040179895aa6480087713455e07b"}
Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.122547 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-59d9f7977f-7dt9k"]
Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.122998 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-59d9f7977f-7dt9k" podUID="92544741-12fa-42ac-ba5b-67179ec9443b" containerName="neutron-api" containerID="cri-o://da489fd3d8eceb37f5d84a7cb93a4298ee1e4fc025f5a8e61c61624f47e5cc8a" gracePeriod=30
Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.123759 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-59d9f7977f-7dt9k" podUID="92544741-12fa-42ac-ba5b-67179ec9443b" containerName="neutron-httpd" containerID="cri-o://8b6cdf8693a8045891752ca1818e301a94afd2ff81089e5e3f78d2e62be53c12" gracePeriod=30
Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.142631 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-59d9f7977f-7dt9k" podUID="92544741-12fa-42ac-ba5b-67179ec9443b" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.196:9696/\": EOF"
Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.200218 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-74cc678f5-fkzpw"]
Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.202662 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-74cc678f5-fkzpw"
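The sequence above is a rolling replacement of the neutron API pod: the SyncLoop DELETE for neutron-59d9f7977f-7dt9k triggers grace-period kills of its two containers, the readiness probe immediately starts failing (the EOF means the server closed the connection while shutting down), and a replacement pod neutron-74cc678f5-fkzpw is admitted. The failing probe is an HTTPS GET against the pod IP on port 9696; a probe of that shape would be declared roughly as below (a sketch using the k8s.io/api and k8s.io/apimachinery types, assuming those modules are in go.mod; the period and threshold values are illustrative, not read from this log):

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// Readiness probe matching the failure above: GET https://<podIP>:9696/
	readiness := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Path:   "/",
				Port:   intstr.FromInt(9696),
				Scheme: corev1.URISchemeHTTPS,
			},
		},
		PeriodSeconds:    5, // illustrative values
		FailureThreshold: 3,
	}
	fmt.Printf("%+v\n", readiness.ProbeHandler.HTTPGet)
}

Failing readiness during shutdown is the intended behavior: it pulls the dying pod out of service endpoints while the grace period runs, so traffic shifts to the replacement before the old containers exit.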
Need to start a new one" pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.223819 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-74cc678f5-fkzpw"] Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.255787 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-internal-tls-certs\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.255942 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-ovndb-tls-certs\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.255972 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-httpd-config\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.256009 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-combined-ca-bundle\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.256036 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbzqz\" (UniqueName: \"kubernetes.io/projected/40093ddb-a585-427d-88f6-110b4ea07578-kube-api-access-pbzqz\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.256128 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-public-tls-certs\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.256275 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-config\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.358898 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-config\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.359029 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-internal-tls-certs\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.359186 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-ovndb-tls-certs\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.359214 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-httpd-config\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.359253 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-combined-ca-bundle\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.359284 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbzqz\" (UniqueName: \"kubernetes.io/projected/40093ddb-a585-427d-88f6-110b4ea07578-kube-api-access-pbzqz\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.359355 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-public-tls-certs\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.363897 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-public-tls-certs\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.367005 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-config\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.371885 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-combined-ca-bundle\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.373572 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-internal-tls-certs\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " 
pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.377323 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-ovndb-tls-certs\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.381684 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/40093ddb-a585-427d-88f6-110b4ea07578-httpd-config\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.393937 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbzqz\" (UniqueName: \"kubernetes.io/projected/40093ddb-a585-427d-88f6-110b4ea07578-kube-api-access-pbzqz\") pod \"neutron-74cc678f5-fkzpw\" (UID: \"40093ddb-a585-427d-88f6-110b4ea07578\") " pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.493720 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.960314 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:24:21 crc kubenswrapper[4721]: I0202 13:24:21.984488 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55dd659f54-28qsl" event={"ID":"0f119900-0b52-425a-be0a-0940a4747f89","Type":"ContainerStarted","Data":"a307dcbe4bd53270347a396194e4ad6fcdab051a06bc6fbbccd4da1332e2bfc7"} Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.038783 4721 generic.go:334] "Generic (PLEG): container finished" podID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" containerID="532a8ed94d348484657193a2554795492b450f7f802dc97b0881265e4ef935ae" exitCode=0 Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.039192 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5","Type":"ContainerDied","Data":"532a8ed94d348484657193a2554795492b450f7f802dc97b0881265e4ef935ae"} Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.039229 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5","Type":"ContainerDied","Data":"5372b8d7305b4393c88e00abc4c50b4b02eb1dd6564a279f35710dd3d18e6691"} Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.039251 4721 scope.go:117] "RemoveContainer" containerID="3192e734a06fcd9f8d184e01215add4fd33d5d9771b0e9c27a5f5edaae278699" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.039441 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.076996 4721 generic.go:334] "Generic (PLEG): container finished" podID="92544741-12fa-42ac-ba5b-67179ec9443b" containerID="8b6cdf8693a8045891752ca1818e301a94afd2ff81089e5e3f78d2e62be53c12" exitCode=0 Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.077536 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59d9f7977f-7dt9k" event={"ID":"92544741-12fa-42ac-ba5b-67179ec9443b","Type":"ContainerDied","Data":"8b6cdf8693a8045891752ca1818e301a94afd2ff81089e5e3f78d2e62be53c12"} Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.094946 4721 generic.go:334] "Generic (PLEG): container finished" podID="9fa244a8-7588-4d87-bd5b-cbcd10780c83" containerID="0b784cae8dddc21c2c3af89d409032d5d888340b53c786c1dc27d600d257dd2b" exitCode=0 Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.095024 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-n52pp" event={"ID":"9fa244a8-7588-4d87-bd5b-cbcd10780c83","Type":"ContainerDied","Data":"0b784cae8dddc21c2c3af89d409032d5d888340b53c786c1dc27d600d257dd2b"} Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.100104 4721 generic.go:334] "Generic (PLEG): container finished" podID="d2fa85eb-972d-4369-8103-dd4cd3e2b78a" containerID="67e81c0f84720634dc560e673791d8be5a3f4506fb68a6b94459043bb3aa0bc2" exitCode=0 Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.100190 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-xjf7x" event={"ID":"d2fa85eb-972d-4369-8103-dd4cd3e2b78a","Type":"ContainerDied","Data":"67e81c0f84720634dc560e673791d8be5a3f4506fb68a6b94459043bb3aa0bc2"} Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.108512 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-scripts\") pod \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.108689 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-config-data\") pod \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.108741 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbbzr\" (UniqueName: \"kubernetes.io/projected/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-kube-api-access-fbbzr\") pod \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.108793 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-sg-core-conf-yaml\") pod \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.108842 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-combined-ca-bundle\") pod \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " Feb 02 13:24:22 crc kubenswrapper[4721]: 
I0202 13:24:22.108945 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-run-httpd\") pod \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.108989 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-log-httpd\") pod \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\" (UID: \"1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5\") " Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.117170 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" (UID: "1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.119961 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-scripts" (OuterVolumeSpecName: "scripts") pod "1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" (UID: "1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.120357 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" (UID: "1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.129054 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-kube-api-access-fbbzr" (OuterVolumeSpecName: "kube-api-access-fbbzr") pod "1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" (UID: "1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5"). InnerVolumeSpecName "kube-api-access-fbbzr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.143294 4721 scope.go:117] "RemoveContainer" containerID="4ae676a120a0891127e9d7af3640c4f6801bac80535f6452a01c2c0a7c50780d" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.153766 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6ccdcdf5fb-gncnr" event={"ID":"8e3f4574-6ad6-4b37-abf5-2005c8692a44","Type":"ContainerStarted","Data":"e41e35e5f4a276a302bdac0022a1e859573103aaa2f1dd17bb86e728220bdc00"} Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.153813 4721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.153847 4721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.155118 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6ccdcdf5fb-gncnr" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.155144 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6ccdcdf5fb-gncnr" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.216481 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.216844 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbbzr\" (UniqueName: \"kubernetes.io/projected/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-kube-api-access-fbbzr\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.216955 4721 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.217054 4721 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.242667 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" (UID: "1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.263594 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" (UID: "1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.320683 4721 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.320746 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.379340 4721 scope.go:117] "RemoveContainer" containerID="532a8ed94d348484657193a2554795492b450f7f802dc97b0881265e4ef935ae" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.441891 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-config-data" (OuterVolumeSpecName: "config-data") pod "1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" (UID: "1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.503594 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6ccdcdf5fb-gncnr" podStartSLOduration=10.503569338 podStartE2EDuration="10.503569338s" podCreationTimestamp="2026-02-02 13:24:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:24:22.232057354 +0000 UTC m=+1402.534571763" watchObservedRunningTime="2026-02-02 13:24:22.503569338 +0000 UTC m=+1402.806083737" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.508216 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-74cc678f5-fkzpw"] Feb 02 13:24:22 crc kubenswrapper[4721]: W0202 13:24:22.518429 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40093ddb_a585_427d_88f6_110b4ea07578.slice/crio-a586a050b289fbafeb1602921c5a9696cfc00666e5e7b8ca13447ea3c8fe03e1 WatchSource:0}: Error finding container a586a050b289fbafeb1602921c5a9696cfc00666e5e7b8ca13447ea3c8fe03e1: Status 404 returned error can't find the container with id a586a050b289fbafeb1602921c5a9696cfc00666e5e7b8ca13447ea3c8fe03e1 Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.527666 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.745274 4721 scope.go:117] "RemoveContainer" containerID="3192e734a06fcd9f8d184e01215add4fd33d5d9771b0e9c27a5f5edaae278699" Feb 02 13:24:22 crc kubenswrapper[4721]: E0202 13:24:22.747784 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3192e734a06fcd9f8d184e01215add4fd33d5d9771b0e9c27a5f5edaae278699\": container with ID starting with 3192e734a06fcd9f8d184e01215add4fd33d5d9771b0e9c27a5f5edaae278699 not found: ID does not exist" containerID="3192e734a06fcd9f8d184e01215add4fd33d5d9771b0e9c27a5f5edaae278699" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.747844 4721 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3192e734a06fcd9f8d184e01215add4fd33d5d9771b0e9c27a5f5edaae278699"} err="failed to get container status \"3192e734a06fcd9f8d184e01215add4fd33d5d9771b0e9c27a5f5edaae278699\": rpc error: code = NotFound desc = could not find container \"3192e734a06fcd9f8d184e01215add4fd33d5d9771b0e9c27a5f5edaae278699\": container with ID starting with 3192e734a06fcd9f8d184e01215add4fd33d5d9771b0e9c27a5f5edaae278699 not found: ID does not exist" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.747879 4721 scope.go:117] "RemoveContainer" containerID="4ae676a120a0891127e9d7af3640c4f6801bac80535f6452a01c2c0a7c50780d" Feb 02 13:24:22 crc kubenswrapper[4721]: E0202 13:24:22.748300 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae676a120a0891127e9d7af3640c4f6801bac80535f6452a01c2c0a7c50780d\": container with ID starting with 4ae676a120a0891127e9d7af3640c4f6801bac80535f6452a01c2c0a7c50780d not found: ID does not exist" containerID="4ae676a120a0891127e9d7af3640c4f6801bac80535f6452a01c2c0a7c50780d" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.748344 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae676a120a0891127e9d7af3640c4f6801bac80535f6452a01c2c0a7c50780d"} err="failed to get container status \"4ae676a120a0891127e9d7af3640c4f6801bac80535f6452a01c2c0a7c50780d\": rpc error: code = NotFound desc = could not find container \"4ae676a120a0891127e9d7af3640c4f6801bac80535f6452a01c2c0a7c50780d\": container with ID starting with 4ae676a120a0891127e9d7af3640c4f6801bac80535f6452a01c2c0a7c50780d not found: ID does not exist" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.748400 4721 scope.go:117] "RemoveContainer" containerID="532a8ed94d348484657193a2554795492b450f7f802dc97b0881265e4ef935ae" Feb 02 13:24:22 crc kubenswrapper[4721]: E0202 13:24:22.751654 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"532a8ed94d348484657193a2554795492b450f7f802dc97b0881265e4ef935ae\": container with ID starting with 532a8ed94d348484657193a2554795492b450f7f802dc97b0881265e4ef935ae not found: ID does not exist" containerID="532a8ed94d348484657193a2554795492b450f7f802dc97b0881265e4ef935ae" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.751694 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"532a8ed94d348484657193a2554795492b450f7f802dc97b0881265e4ef935ae"} err="failed to get container status \"532a8ed94d348484657193a2554795492b450f7f802dc97b0881265e4ef935ae\": rpc error: code = NotFound desc = could not find container \"532a8ed94d348484657193a2554795492b450f7f802dc97b0881265e4ef935ae\": container with ID starting with 532a8ed94d348484657193a2554795492b450f7f802dc97b0881265e4ef935ae not found: ID does not exist" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.781246 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.802307 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.819088 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:24:22 crc kubenswrapper[4721]: E0202 13:24:22.819934 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" 
containerName="proxy-httpd" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.819971 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" containerName="proxy-httpd" Feb 02 13:24:22 crc kubenswrapper[4721]: E0202 13:24:22.819993 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" containerName="ceilometer-notification-agent" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.819999 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" containerName="ceilometer-notification-agent" Feb 02 13:24:22 crc kubenswrapper[4721]: E0202 13:24:22.820008 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" containerName="sg-core" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.820028 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" containerName="sg-core" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.820338 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" containerName="sg-core" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.820362 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" containerName="ceilometer-notification-agent" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.820376 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" containerName="proxy-httpd" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.823651 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.826219 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.826622 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.831936 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.954598 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1482d2e-b885-44bd-b679-109f0b9698ea-run-httpd\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.954661 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.954711 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4mxr\" (UniqueName: \"kubernetes.io/projected/a1482d2e-b885-44bd-b679-109f0b9698ea-kube-api-access-c4mxr\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.954795 4721 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-scripts\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.954847 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.954884 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1482d2e-b885-44bd-b679-109f0b9698ea-log-httpd\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:22 crc kubenswrapper[4721]: I0202 13:24:22.954910 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-config-data\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.058635 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-scripts\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.058754 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.058805 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1482d2e-b885-44bd-b679-109f0b9698ea-log-httpd\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.058827 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-config-data\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.059018 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1482d2e-b885-44bd-b679-109f0b9698ea-run-httpd\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.059088 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.059155 4721 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4mxr\" (UniqueName: \"kubernetes.io/projected/a1482d2e-b885-44bd-b679-109f0b9698ea-kube-api-access-c4mxr\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.065999 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1482d2e-b885-44bd-b679-109f0b9698ea-log-httpd\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.068138 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1482d2e-b885-44bd-b679-109f0b9698ea-run-httpd\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.070777 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.073150 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.074182 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-config-data\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.074510 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-scripts\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.082295 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4mxr\" (UniqueName: \"kubernetes.io/projected/a1482d2e-b885-44bd-b679-109f0b9698ea-kube-api-access-c4mxr\") pod \"ceilometer-0\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " pod="openstack/ceilometer-0" Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.164676 4721 util.go:30] "No sandbox for pod can be found. 
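Each record above carries the klog header behind a journald prefix: severity letter fused to the MMDD date (e.g. I0202), wall-clock time, the kubelet PID, and the emitting source file:line, followed by the structured message. A minimal sketch in Go for splitting that header out when working with this log; the regex, field names, and the one-record-per-line assumption are illustrative, not taken from kubelet source:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Illustrative pattern for the klog header after the journald prefix:
// severity letter + MMDD, wall-clock time, kubelet PID, source file:line,
// then the structured message.
var klogLine = regexp.MustCompile(
	`kubenswrapper\[\d+\]: ([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+) +(\d+) ([\w./-]+:\d+)\] (.*)$`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1<<20), 1<<20) // kubelet lines can be very long
	for sc.Scan() {
		m := klogLine.FindStringSubmatch(sc.Text())
		if m == nil {
			continue // continuation lines (multi-line probe output) have no header
		}
		fmt.Printf("sev=%s src=%s msg=%s\n", m[1], m[5], m[6])
	}
}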
Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.195185 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-xjf7x" event={"ID":"d2fa85eb-972d-4369-8103-dd4cd3e2b78a","Type":"ContainerStarted","Data":"7517327cffd1477bc363977d8634c4cb01f4fc58a61eaaad4127a09cf6a0605e"}
Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.195262 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-xjf7x"
Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.216510 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55dd659f54-28qsl" event={"ID":"0f119900-0b52-425a-be0a-0940a4747f89","Type":"ContainerStarted","Data":"0d5c49970e3de4e3928ae616ad70c665a088d4d4722ff49627ed715494d6560a"}
Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.217335 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55dd659f54-28qsl"
Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.217542 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55dd659f54-28qsl"
Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.237121 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-59d9f7977f-7dt9k" podUID="92544741-12fa-42ac-ba5b-67179ec9443b" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.196:9696/\": dial tcp 10.217.0.196:9696: connect: connection refused"
Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.245270 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74cc678f5-fkzpw" event={"ID":"40093ddb-a585-427d-88f6-110b4ea07578","Type":"ContainerStarted","Data":"7f9687cede0a9de14646ab0f986c731284c501c40e9a386c0933f293af5436a2"}
Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.245334 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74cc678f5-fkzpw" event={"ID":"40093ddb-a585-427d-88f6-110b4ea07578","Type":"ContainerStarted","Data":"a586a050b289fbafeb1602921c5a9696cfc00666e5e7b8ca13447ea3c8fe03e1"}
Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.311885 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-xjf7x" podStartSLOduration=4.311866229 podStartE2EDuration="4.311866229s" podCreationTimestamp="2026-02-02 13:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:24:23.289491522 +0000 UTC m=+1403.592005921" watchObservedRunningTime="2026-02-02 13:24:23.311866229 +0000 UTC m=+1403.614380618"
Feb 02 13:24:23 crc kubenswrapper[4721]: I0202 13:24:23.335606 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55dd659f54-28qsl" podStartSLOduration=4.335578722 podStartE2EDuration="4.335578722s" podCreationTimestamp="2026-02-02 13:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:24:23.33143535 +0000 UTC m=+1403.633949739" watchObservedRunningTime="2026-02-02 13:24:23.335578722 +0000 UTC m=+1403.638093131"
Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.024175 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mlbxn"
Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.024550 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mlbxn"
Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.186323 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.207823 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.207982 4721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.308646 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.326575 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-74cc678f5-fkzpw" event={"ID":"40093ddb-a585-427d-88f6-110b4ea07578","Type":"ContainerStarted","Data":"c993532ad5f15f60539ee35f7980604943df19a8ea9995c5b9dab7b406aa3121"}
Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.327504 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-74cc678f5-fkzpw"
Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.407321 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-74cc678f5-fkzpw" podStartSLOduration=3.407296809 podStartE2EDuration="3.407296809s" podCreationTimestamp="2026-02-02 13:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:24:24.380488391 +0000 UTC m=+1404.683002780" watchObservedRunningTime="2026-02-02 13:24:24.407296809 +0000 UTC m=+1404.709811198"
Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.452735 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5" path="/var/lib/kubelet/pods/1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5/volumes"
Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.508080 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-57c58bbb98-gpbp2"]
Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.510539 4721 util.go:30] "No sandbox for pod can be found.
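The pod_startup_latency_tracker entries report podStartE2EDuration as the observed running time minus podCreationTimestamp; with no image pull (first/lastFinishedPulling left at the zero time) podStartSLOduration is the same value. A quick check in Go against the dnsmasq figures above; the layout string is an assumption matching the printed wall-clock portion, and the trailing "m=+..." monotonic reading is dropped before parsing:

package main

import (
	"fmt"
	"time"
)

// Wall-clock layout matching the timestamps printed in the log.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2026-02-02 13:24:19 +0000 UTC")
	running := mustParse("2026-02-02 13:24:23.311866229 +0000 UTC")
	// Prints 4.311866229s, matching podStartE2EDuration logged for
	// dnsmasq-dns-85ff748b95-xjf7x above.
	fmt.Println(running.Sub(created))
}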
Need to start a new one" pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.516639 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.517284 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.521931 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57c58bbb98-gpbp2"] Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.554193 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbxcv\" (UniqueName: \"kubernetes.io/projected/183927fe-ec27-461b-8284-3e71f5cb666a-kube-api-access-kbxcv\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.554303 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183927fe-ec27-461b-8284-3e71f5cb666a-config-data\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.554351 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/183927fe-ec27-461b-8284-3e71f5cb666a-config-data-custom\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.554446 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/183927fe-ec27-461b-8284-3e71f5cb666a-internal-tls-certs\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.554527 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/183927fe-ec27-461b-8284-3e71f5cb666a-public-tls-certs\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.554671 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/183927fe-ec27-461b-8284-3e71f5cb666a-logs\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.554931 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183927fe-ec27-461b-8284-3e71f5cb666a-combined-ca-bundle\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.657490 4721 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183927fe-ec27-461b-8284-3e71f5cb666a-combined-ca-bundle\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.657661 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbxcv\" (UniqueName: \"kubernetes.io/projected/183927fe-ec27-461b-8284-3e71f5cb666a-kube-api-access-kbxcv\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.657710 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183927fe-ec27-461b-8284-3e71f5cb666a-config-data\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.657744 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/183927fe-ec27-461b-8284-3e71f5cb666a-config-data-custom\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.659670 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/183927fe-ec27-461b-8284-3e71f5cb666a-internal-tls-certs\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.659906 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/183927fe-ec27-461b-8284-3e71f5cb666a-public-tls-certs\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.660180 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/183927fe-ec27-461b-8284-3e71f5cb666a-logs\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.661422 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/183927fe-ec27-461b-8284-3e71f5cb666a-logs\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.665884 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/183927fe-ec27-461b-8284-3e71f5cb666a-internal-tls-certs\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.668847 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/183927fe-ec27-461b-8284-3e71f5cb666a-combined-ca-bundle\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.681129 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbxcv\" (UniqueName: \"kubernetes.io/projected/183927fe-ec27-461b-8284-3e71f5cb666a-kube-api-access-kbxcv\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.689401 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183927fe-ec27-461b-8284-3e71f5cb666a-config-data\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.698425 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/183927fe-ec27-461b-8284-3e71f5cb666a-config-data-custom\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.711698 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/183927fe-ec27-461b-8284-3e71f5cb666a-public-tls-certs\") pod \"barbican-api-57c58bbb98-gpbp2\" (UID: \"183927fe-ec27-461b-8284-3e71f5cb666a\") " pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:24 crc kubenswrapper[4721]: I0202 13:24:24.858048 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:25 crc kubenswrapper[4721]: I0202 13:24:25.111272 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mlbxn" podUID="37372b76-ef54-4a44-9b56-dea754373219" containerName="registry-server" probeResult="failure" output=< Feb 02 13:24:25 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:24:25 crc kubenswrapper[4721]: > Feb 02 13:24:25 crc kubenswrapper[4721]: I0202 13:24:25.290564 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 13:24:25 crc kubenswrapper[4721]: I0202 13:24:25.290685 4721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 13:24:25 crc kubenswrapper[4721]: I0202 13:24:25.293564 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 13:24:26 crc kubenswrapper[4721]: W0202 13:24:26.294452 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1482d2e_b885_44bd_b679_109f0b9698ea.slice/crio-c455b716179204c8c0342b961a933847d0497bb7d80b63f7c4dd9a07665bceb2 WatchSource:0}: Error finding container c455b716179204c8c0342b961a933847d0497bb7d80b63f7c4dd9a07665bceb2: Status 404 returned error can't find the container with id c455b716179204c8c0342b961a933847d0497bb7d80b63f7c4dd9a07665bceb2 Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.427624 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-n52pp" Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.459674 4721 generic.go:334] "Generic (PLEG): container finished" podID="ad3578ef-5d1b-4c52-939c-237feadc1c5c" containerID="ce2db44950c758448aaea5320ccdad1fe422fd10d5dc9377dff5887076136a7a" exitCode=0 Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.459750 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7wjxh" event={"ID":"ad3578ef-5d1b-4c52-939c-237feadc1c5c","Type":"ContainerDied","Data":"ce2db44950c758448aaea5320ccdad1fe422fd10d5dc9377dff5887076136a7a"} Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.467678 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1482d2e-b885-44bd-b679-109f0b9698ea","Type":"ContainerStarted","Data":"c455b716179204c8c0342b961a933847d0497bb7d80b63f7c4dd9a07665bceb2"} Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.484413 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-n52pp" event={"ID":"9fa244a8-7588-4d87-bd5b-cbcd10780c83","Type":"ContainerDied","Data":"f8c8f50f7ba68fd272f974a962b3d958bc797bb63b6619f9ead2e2ffc4525a32"} Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.484488 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8c8f50f7ba68fd272f974a962b3d958bc797bb63b6619f9ead2e2ffc4525a32" Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.484441 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-n52pp" Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.577614 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa244a8-7588-4d87-bd5b-cbcd10780c83-combined-ca-bundle\") pod \"9fa244a8-7588-4d87-bd5b-cbcd10780c83\" (UID: \"9fa244a8-7588-4d87-bd5b-cbcd10780c83\") " Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.577705 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ct7f\" (UniqueName: \"kubernetes.io/projected/9fa244a8-7588-4d87-bd5b-cbcd10780c83-kube-api-access-2ct7f\") pod \"9fa244a8-7588-4d87-bd5b-cbcd10780c83\" (UID: \"9fa244a8-7588-4d87-bd5b-cbcd10780c83\") " Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.577984 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fa244a8-7588-4d87-bd5b-cbcd10780c83-config-data\") pod \"9fa244a8-7588-4d87-bd5b-cbcd10780c83\" (UID: \"9fa244a8-7588-4d87-bd5b-cbcd10780c83\") " Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.594269 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fa244a8-7588-4d87-bd5b-cbcd10780c83-kube-api-access-2ct7f" (OuterVolumeSpecName: "kube-api-access-2ct7f") pod "9fa244a8-7588-4d87-bd5b-cbcd10780c83" (UID: "9fa244a8-7588-4d87-bd5b-cbcd10780c83"). InnerVolumeSpecName "kube-api-access-2ct7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.600859 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b2blk" Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.601594 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b2blk" Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.678699 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fa244a8-7588-4d87-bd5b-cbcd10780c83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fa244a8-7588-4d87-bd5b-cbcd10780c83" (UID: "9fa244a8-7588-4d87-bd5b-cbcd10780c83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.688483 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fa244a8-7588-4d87-bd5b-cbcd10780c83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.688866 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ct7f\" (UniqueName: \"kubernetes.io/projected/9fa244a8-7588-4d87-bd5b-cbcd10780c83-kube-api-access-2ct7f\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.773876 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fa244a8-7588-4d87-bd5b-cbcd10780c83-config-data" (OuterVolumeSpecName: "config-data") pod "9fa244a8-7588-4d87-bd5b-cbcd10780c83" (UID: "9fa244a8-7588-4d87-bd5b-cbcd10780c83"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:26 crc kubenswrapper[4721]: I0202 13:24:26.797952 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fa244a8-7588-4d87-bd5b-cbcd10780c83-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:27 crc kubenswrapper[4721]: I0202 13:24:27.066284 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57c58bbb98-gpbp2"] Feb 02 13:24:27 crc kubenswrapper[4721]: I0202 13:24:27.506161 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f4497866b-px6fz" event={"ID":"755b5957-fcfa-486a-8e63-d562742d6650","Type":"ContainerStarted","Data":"c0b11b4c5c6c5c0be5728856fe317c744135fa5d6f051bf89ae0e65d2022026b"} Feb 02 13:24:27 crc kubenswrapper[4721]: I0202 13:24:27.509946 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d4595f9f9-4d2g5" event={"ID":"93a7211b-9a15-4765-99e2-520bd1d62ff1","Type":"ContainerStarted","Data":"ae92c461b5387d32c6a43c57e5ab0b32b352f0575d859542465440d2aee9a089"} Feb 02 13:24:27 crc kubenswrapper[4721]: I0202 13:24:27.512780 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57c58bbb98-gpbp2" event={"ID":"183927fe-ec27-461b-8284-3e71f5cb666a","Type":"ContainerStarted","Data":"033d1d8f9a9608dee54f5aa6ae922d818fb5bccfe3b340973427c50aa9ea1e38"} Feb 02 13:24:27 crc kubenswrapper[4721]: I0202 13:24:27.512836 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57c58bbb98-gpbp2" event={"ID":"183927fe-ec27-461b-8284-3e71f5cb666a","Type":"ContainerStarted","Data":"eb604f48b8181a624c73b1e2c40cb06a20997da36f4a8dffb4a69604e13e0240"} Feb 02 13:24:27 crc kubenswrapper[4721]: I0202 13:24:27.690715 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-b2blk" podUID="cb085bc7-03fe-45d5-8293-754aa8a47e79" containerName="registry-server" probeResult="failure" output=< Feb 02 13:24:27 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:24:27 crc kubenswrapper[4721]: > Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.035642 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.150602 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-db-sync-config-data\") pod \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.150735 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad3578ef-5d1b-4c52-939c-237feadc1c5c-etc-machine-id\") pod \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.150759 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-scripts\") pod \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.150857 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-combined-ca-bundle\") pod \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.150902 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-config-data\") pod \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.151036 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4b6x\" (UniqueName: \"kubernetes.io/projected/ad3578ef-5d1b-4c52-939c-237feadc1c5c-kube-api-access-c4b6x\") pod \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\" (UID: \"ad3578ef-5d1b-4c52-939c-237feadc1c5c\") " Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.151648 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad3578ef-5d1b-4c52-939c-237feadc1c5c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ad3578ef-5d1b-4c52-939c-237feadc1c5c" (UID: "ad3578ef-5d1b-4c52-939c-237feadc1c5c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.159859 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ad3578ef-5d1b-4c52-939c-237feadc1c5c" (UID: "ad3578ef-5d1b-4c52-939c-237feadc1c5c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.165204 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad3578ef-5d1b-4c52-939c-237feadc1c5c-kube-api-access-c4b6x" (OuterVolumeSpecName: "kube-api-access-c4b6x") pod "ad3578ef-5d1b-4c52-939c-237feadc1c5c" (UID: "ad3578ef-5d1b-4c52-939c-237feadc1c5c"). InnerVolumeSpecName "kube-api-access-c4b6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.171058 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-scripts" (OuterVolumeSpecName: "scripts") pod "ad3578ef-5d1b-4c52-939c-237feadc1c5c" (UID: "ad3578ef-5d1b-4c52-939c-237feadc1c5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.197206 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad3578ef-5d1b-4c52-939c-237feadc1c5c" (UID: "ad3578ef-5d1b-4c52-939c-237feadc1c5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.226270 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-config-data" (OuterVolumeSpecName: "config-data") pod "ad3578ef-5d1b-4c52-939c-237feadc1c5c" (UID: "ad3578ef-5d1b-4c52-939c-237feadc1c5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.254668 4721 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.254721 4721 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad3578ef-5d1b-4c52-939c-237feadc1c5c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.254731 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.254740 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.254750 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3578ef-5d1b-4c52-939c-237feadc1c5c-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.254758 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4b6x\" (UniqueName: \"kubernetes.io/projected/ad3578ef-5d1b-4c52-939c-237feadc1c5c-kube-api-access-c4b6x\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.531166 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57c58bbb98-gpbp2" event={"ID":"183927fe-ec27-461b-8284-3e71f5cb666a","Type":"ContainerStarted","Data":"2f3c5eefde9602b1bed1df089a10bcc66e2a0b0c453b9fb2dba2991fd6188739"} Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.533032 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.533215 4721 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.534580 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d4595f9f9-4d2g5" event={"ID":"93a7211b-9a15-4765-99e2-520bd1d62ff1","Type":"ContainerStarted","Data":"37b5a3e1dc48437eb0809f4f006bad5aafb8f0c93896c78d9e286c2a4981f720"} Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.551383 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f4497866b-px6fz" event={"ID":"755b5957-fcfa-486a-8e63-d562742d6650","Type":"ContainerStarted","Data":"77fe2816980c07cff5664313909daa4e5208b233a25d31fe97f870e856ead996"} Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.565971 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-7wjxh" event={"ID":"ad3578ef-5d1b-4c52-939c-237feadc1c5c","Type":"ContainerDied","Data":"85181cad5536c1551bbbb525ba3a357b7e769080b51a0a4639fbdd8c37e6d7bd"} Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.566009 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85181cad5536c1551bbbb525ba3a357b7e769080b51a0a4639fbdd8c37e6d7bd" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.566078 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-7wjxh" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.568324 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-57c58bbb98-gpbp2" podStartSLOduration=4.568299869 podStartE2EDuration="4.568299869s" podCreationTimestamp="2026-02-02 13:24:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:24:28.561180746 +0000 UTC m=+1408.863695135" watchObservedRunningTime="2026-02-02 13:24:28.568299869 +0000 UTC m=+1408.870814258" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.578834 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1482d2e-b885-44bd-b679-109f0b9698ea","Type":"ContainerStarted","Data":"0b3cb07d111bcefe4cce6e8cdec0586b864539201a7ae3eef461e7791d1c123a"} Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.578888 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1482d2e-b885-44bd-b679-109f0b9698ea","Type":"ContainerStarted","Data":"e7a1435ecc8600326126b2f72cf42852a2ff26324fb369f144ca54a8e7a02fe8"} Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.594549 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6f4497866b-px6fz" podStartSLOduration=4.72115542 podStartE2EDuration="10.594530041s" podCreationTimestamp="2026-02-02 13:24:18 +0000 UTC" firstStartedPulling="2026-02-02 13:24:20.660564284 +0000 UTC m=+1400.963078673" lastFinishedPulling="2026-02-02 13:24:26.533938905 +0000 UTC m=+1406.836453294" observedRunningTime="2026-02-02 13:24:28.591435967 +0000 UTC m=+1408.893950356" watchObservedRunningTime="2026-02-02 13:24:28.594530041 +0000 UTC m=+1408.897044430" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.640872 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-d4595f9f9-4d2g5" podStartSLOduration=4.450291863 podStartE2EDuration="10.640847607s" podCreationTimestamp="2026-02-02 13:24:18 +0000 UTC" 
firstStartedPulling="2026-02-02 13:24:20.343144815 +0000 UTC m=+1400.645659214" lastFinishedPulling="2026-02-02 13:24:26.533700569 +0000 UTC m=+1406.836214958" observedRunningTime="2026-02-02 13:24:28.618892312 +0000 UTC m=+1408.921406711" watchObservedRunningTime="2026-02-02 13:24:28.640847607 +0000 UTC m=+1408.943362016" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.855000 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 13:24:28 crc kubenswrapper[4721]: E0202 13:24:28.855734 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa244a8-7588-4d87-bd5b-cbcd10780c83" containerName="heat-db-sync" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.855806 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa244a8-7588-4d87-bd5b-cbcd10780c83" containerName="heat-db-sync" Feb 02 13:24:28 crc kubenswrapper[4721]: E0202 13:24:28.855909 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad3578ef-5d1b-4c52-939c-237feadc1c5c" containerName="cinder-db-sync" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.855966 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3578ef-5d1b-4c52-939c-237feadc1c5c" containerName="cinder-db-sync" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.856305 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad3578ef-5d1b-4c52-939c-237feadc1c5c" containerName="cinder-db-sync" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.856405 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fa244a8-7588-4d87-bd5b-cbcd10780c83" containerName="heat-db-sync" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.859594 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.870994 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.871371 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-v76dv" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.871588 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.877259 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.896703 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.982919 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.983230 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5dce367e-6a22-454b-bd02-4a69a739af22-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.983415 4721 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8v7w\" (UniqueName: \"kubernetes.io/projected/5dce367e-6a22-454b-bd02-4a69a739af22-kube-api-access-r8v7w\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.983556 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-scripts\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.983706 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.983869 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-config-data\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.984685 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xjf7x"] Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.985177 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-xjf7x" podUID="d2fa85eb-972d-4369-8103-dd4cd3e2b78a" containerName="dnsmasq-dns" containerID="cri-o://7517327cffd1477bc363977d8634c4cb01f4fc58a61eaaad4127a09cf6a0605e" gracePeriod=10 Feb 02 13:24:28 crc kubenswrapper[4721]: I0202 13:24:28.992703 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-xjf7x" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.054150 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9czmj"] Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.075145 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.087151 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-config-data\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.087369 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.087406 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5dce367e-6a22-454b-bd02-4a69a739af22-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.087429 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8v7w\" (UniqueName: \"kubernetes.io/projected/5dce367e-6a22-454b-bd02-4a69a739af22-kube-api-access-r8v7w\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.087458 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-scripts\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.087547 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.088027 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5dce367e-6a22-454b-bd02-4a69a739af22-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.095742 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.112292 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-config-data\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.116647 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.139045 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8v7w\" (UniqueName: \"kubernetes.io/projected/5dce367e-6a22-454b-bd02-4a69a739af22-kube-api-access-r8v7w\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.205269 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.205482 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kpkt\" (UniqueName: \"kubernetes.io/projected/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-kube-api-access-5kpkt\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.205623 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.205663 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-config\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.205711 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.205884 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.228443 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-scripts\") pod \"cinder-scheduler-0\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.324406 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.324530 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kpkt\" (UniqueName: \"kubernetes.io/projected/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-kube-api-access-5kpkt\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.324616 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.324642 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-config\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.324673 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.324758 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.332442 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.333027 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.337833 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.339867 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.352136 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9czmj"] Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.359341 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-config\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.363533 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kpkt\" (UniqueName: \"kubernetes.io/projected/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-kube-api-access-5kpkt\") pod \"dnsmasq-dns-5c9776ccc5-9czmj\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.446416 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.449230 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.504008 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.507664 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.511305 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.545449 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-config-data-custom\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.545601 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-config-data\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.545705 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6048763-9be8-4530-b02a-78022c20d668-logs\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.545736 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-scripts\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.553451 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8k67\" (UniqueName: \"kubernetes.io/projected/a6048763-9be8-4530-b02a-78022c20d668-kube-api-access-m8k67\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.553627 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.553849 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6048763-9be8-4530-b02a-78022c20d668-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.578488 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.630374 4721 generic.go:334] "Generic (PLEG): container finished" podID="92544741-12fa-42ac-ba5b-67179ec9443b" containerID="da489fd3d8eceb37f5d84a7cb93a4298ee1e4fc025f5a8e61c61624f47e5cc8a" exitCode=0 Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.630475 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59d9f7977f-7dt9k" event={"ID":"92544741-12fa-42ac-ba5b-67179ec9443b","Type":"ContainerDied","Data":"da489fd3d8eceb37f5d84a7cb93a4298ee1e4fc025f5a8e61c61624f47e5cc8a"} Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.655673 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-config-data-custom\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.655813 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-config-data\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.655889 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6048763-9be8-4530-b02a-78022c20d668-logs\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.655955 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-scripts\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.655973 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8k67\" (UniqueName: \"kubernetes.io/projected/a6048763-9be8-4530-b02a-78022c20d668-kube-api-access-m8k67\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.656026 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.656119 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6048763-9be8-4530-b02a-78022c20d668-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.667920 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6048763-9be8-4530-b02a-78022c20d668-logs\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.684758 4721 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-config-data-custom\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.693591 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-scripts\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.706734 4721 generic.go:334] "Generic (PLEG): container finished" podID="d2fa85eb-972d-4369-8103-dd4cd3e2b78a" containerID="7517327cffd1477bc363977d8634c4cb01f4fc58a61eaaad4127a09cf6a0605e" exitCode=0 Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.706939 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-xjf7x" event={"ID":"d2fa85eb-972d-4369-8103-dd4cd3e2b78a","Type":"ContainerDied","Data":"7517327cffd1477bc363977d8634c4cb01f4fc58a61eaaad4127a09cf6a0605e"} Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.720062 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-config-data\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.722714 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8k67\" (UniqueName: \"kubernetes.io/projected/a6048763-9be8-4530-b02a-78022c20d668-kube-api-access-m8k67\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.731371 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.733286 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6048763-9be8-4530-b02a-78022c20d668-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " pod="openstack/cinder-api-0" Feb 02 13:24:29 crc kubenswrapper[4721]: I0202 13:24:29.986523 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.117663 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-xjf7x" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.194106 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-ovsdbserver-sb\") pod \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.194179 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-dns-svc\") pod \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.194217 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p7rn\" (UniqueName: \"kubernetes.io/projected/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-kube-api-access-5p7rn\") pod \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.194392 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-config\") pod \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.194487 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-dns-swift-storage-0\") pod \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.194575 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-ovsdbserver-nb\") pod \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\" (UID: \"d2fa85eb-972d-4369-8103-dd4cd3e2b78a\") " Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.238331 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-kube-api-access-5p7rn" (OuterVolumeSpecName: "kube-api-access-5p7rn") pod "d2fa85eb-972d-4369-8103-dd4cd3e2b78a" (UID: "d2fa85eb-972d-4369-8103-dd4cd3e2b78a"). InnerVolumeSpecName "kube-api-access-5p7rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.300621 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p7rn\" (UniqueName: \"kubernetes.io/projected/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-kube-api-access-5p7rn\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.326762 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-config" (OuterVolumeSpecName: "config") pod "d2fa85eb-972d-4369-8103-dd4cd3e2b78a" (UID: "d2fa85eb-972d-4369-8103-dd4cd3e2b78a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.330592 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d2fa85eb-972d-4369-8103-dd4cd3e2b78a" (UID: "d2fa85eb-972d-4369-8103-dd4cd3e2b78a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.368056 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d2fa85eb-972d-4369-8103-dd4cd3e2b78a" (UID: "d2fa85eb-972d-4369-8103-dd4cd3e2b78a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.372738 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d2fa85eb-972d-4369-8103-dd4cd3e2b78a" (UID: "d2fa85eb-972d-4369-8103-dd4cd3e2b78a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.403679 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.403708 4721 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.403722 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.403731 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.448165 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d2fa85eb-972d-4369-8103-dd4cd3e2b78a" (UID: "d2fa85eb-972d-4369-8103-dd4cd3e2b78a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.506909 4721 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2fa85eb-972d-4369-8103-dd4cd3e2b78a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.713539 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.757121 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59d9f7977f-7dt9k" event={"ID":"92544741-12fa-42ac-ba5b-67179ec9443b","Type":"ContainerDied","Data":"f6156b432d02a8811d9c46ced7a52980aa20788c34415c25f1c59b54dac366c9"} Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.757201 4721 scope.go:117] "RemoveContainer" containerID="8b6cdf8693a8045891752ca1818e301a94afd2ff81089e5e3f78d2e62be53c12" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.757426 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59d9f7977f-7dt9k" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.776816 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-xjf7x" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.776834 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-xjf7x" event={"ID":"d2fa85eb-972d-4369-8103-dd4cd3e2b78a","Type":"ContainerDied","Data":"99da6e129c53d9990311d147953c7a4dfacacce031a5f1e7ffc090745f296f41"} Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.788889 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1482d2e-b885-44bd-b679-109f0b9698ea","Type":"ContainerStarted","Data":"4d53de923a6215191fa0ab435e823d19598bea6ec82ecee59ac252608725c58c"} Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.817221 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-ovndb-tls-certs\") pod \"92544741-12fa-42ac-ba5b-67179ec9443b\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.817369 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-httpd-config\") pod \"92544741-12fa-42ac-ba5b-67179ec9443b\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.817456 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zxkh\" (UniqueName: \"kubernetes.io/projected/92544741-12fa-42ac-ba5b-67179ec9443b-kube-api-access-9zxkh\") pod \"92544741-12fa-42ac-ba5b-67179ec9443b\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.817604 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-combined-ca-bundle\") pod \"92544741-12fa-42ac-ba5b-67179ec9443b\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.817716 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-config\") pod \"92544741-12fa-42ac-ba5b-67179ec9443b\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.817848 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-internal-tls-certs\") pod \"92544741-12fa-42ac-ba5b-67179ec9443b\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.817926 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-public-tls-certs\") pod \"92544741-12fa-42ac-ba5b-67179ec9443b\" (UID: \"92544741-12fa-42ac-ba5b-67179ec9443b\") " Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.829328 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92544741-12fa-42ac-ba5b-67179ec9443b-kube-api-access-9zxkh" (OuterVolumeSpecName: "kube-api-access-9zxkh") pod "92544741-12fa-42ac-ba5b-67179ec9443b" (UID: "92544741-12fa-42ac-ba5b-67179ec9443b"). InnerVolumeSpecName "kube-api-access-9zxkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.835476 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xjf7x"] Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.858947 4721 scope.go:117] "RemoveContainer" containerID="da489fd3d8eceb37f5d84a7cb93a4298ee1e4fc025f5a8e61c61624f47e5cc8a" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.871394 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "92544741-12fa-42ac-ba5b-67179ec9443b" (UID: "92544741-12fa-42ac-ba5b-67179ec9443b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.874714 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xjf7x"] Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.924605 4721 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.924657 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zxkh\" (UniqueName: \"kubernetes.io/projected/92544741-12fa-42ac-ba5b-67179ec9443b-kube-api-access-9zxkh\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:30 crc kubenswrapper[4721]: I0202 13:24:30.996502 4721 scope.go:117] "RemoveContainer" containerID="7517327cffd1477bc363977d8634c4cb01f4fc58a61eaaad4127a09cf6a0605e" Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.022396 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "92544741-12fa-42ac-ba5b-67179ec9443b" (UID: "92544741-12fa-42ac-ba5b-67179ec9443b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.026643 4721 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.036571 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.060220 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92544741-12fa-42ac-ba5b-67179ec9443b" (UID: "92544741-12fa-42ac-ba5b-67179ec9443b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.083611 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9czmj"] Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.103385 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "92544741-12fa-42ac-ba5b-67179ec9443b" (UID: "92544741-12fa-42ac-ba5b-67179ec9443b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.113967 4721 scope.go:117] "RemoveContainer" containerID="67e81c0f84720634dc560e673791d8be5a3f4506fb68a6b94459043bb3aa0bc2" Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.118046 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-config" (OuterVolumeSpecName: "config") pod "92544741-12fa-42ac-ba5b-67179ec9443b" (UID: "92544741-12fa-42ac-ba5b-67179ec9443b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.130675 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.130710 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.130724 4721 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.148147 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "92544741-12fa-42ac-ba5b-67179ec9443b" (UID: "92544741-12fa-42ac-ba5b-67179ec9443b"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.232895 4721 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/92544741-12fa-42ac-ba5b-67179ec9443b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.276509 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.667142 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-59d9f7977f-7dt9k"] Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.768652 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-59d9f7977f-7dt9k"] Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.837672 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" event={"ID":"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f","Type":"ContainerStarted","Data":"0e5ec1ece5d46afcb7f1cd43125f924d0c21a100ba9e55bfc9c2436d4917aef7"} Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.866296 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5dce367e-6a22-454b-bd02-4a69a739af22","Type":"ContainerStarted","Data":"308b644f1394a2cc3225f510fa06697b0a8ae1e9f8e2aa6c15bae4c005148f01"} Feb 02 13:24:31 crc kubenswrapper[4721]: I0202 13:24:31.883013 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a6048763-9be8-4530-b02a-78022c20d668","Type":"ContainerStarted","Data":"722b50e74b8986a87887949a03c96fc3de9ed0d41b61df9c48e69c902703c27b"} Feb 02 13:24:32 crc kubenswrapper[4721]: I0202 13:24:32.444254 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92544741-12fa-42ac-ba5b-67179ec9443b" path="/var/lib/kubelet/pods/92544741-12fa-42ac-ba5b-67179ec9443b/volumes" Feb 02 13:24:32 crc kubenswrapper[4721]: I0202 13:24:32.445196 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2fa85eb-972d-4369-8103-dd4cd3e2b78a" path="/var/lib/kubelet/pods/d2fa85eb-972d-4369-8103-dd4cd3e2b78a/volumes" Feb 02 13:24:32 crc kubenswrapper[4721]: I0202 13:24:32.719043 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 02 13:24:32 crc kubenswrapper[4721]: I0202 13:24:32.902346 4721 generic.go:334] "Generic (PLEG): container finished" podID="f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" containerID="4cdfecffc79f466040cbdb103a7f0aa0881b7f82c12764f848d66d7d7434f505" exitCode=0 Feb 02 13:24:32 crc kubenswrapper[4721]: I0202 13:24:32.902405 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" event={"ID":"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f","Type":"ContainerDied","Data":"4cdfecffc79f466040cbdb103a7f0aa0881b7f82c12764f848d66d7d7434f505"} Feb 02 13:24:33 crc kubenswrapper[4721]: I0202 13:24:33.855182 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-55dd659f54-28qsl" podUID="0f119900-0b52-425a-be0a-0940a4747f89" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.207:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 13:24:33 crc kubenswrapper[4721]: I0202 13:24:33.858776 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-55dd659f54-28qsl" podUID="0f119900-0b52-425a-be0a-0940a4747f89" 
containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.207:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 13:24:33 crc kubenswrapper[4721]: I0202 13:24:33.866416 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:33 crc kubenswrapper[4721]: I0202 13:24:33.972145 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1482d2e-b885-44bd-b679-109f0b9698ea","Type":"ContainerStarted","Data":"f6ced5c029fbb0ced62ee14010b08872173b1b7d00f61d4cb14f85035de775b1"} Feb 02 13:24:33 crc kubenswrapper[4721]: I0202 13:24:33.973460 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 13:24:33 crc kubenswrapper[4721]: I0202 13:24:33.976526 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" event={"ID":"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f","Type":"ContainerStarted","Data":"79b4d95f7f9db98cc9b0a4ae6c82ff501b94e112bfc5f0a0bf15dd2adbe2a2a3"} Feb 02 13:24:33 crc kubenswrapper[4721]: I0202 13:24:33.977376 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:33 crc kubenswrapper[4721]: I0202 13:24:33.979147 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5dce367e-6a22-454b-bd02-4a69a739af22","Type":"ContainerStarted","Data":"d6fc3e691cab5185ee5e5e86d01fdac6b5e28431f28e13143ff03ccfe61af6f1"} Feb 02 13:24:33 crc kubenswrapper[4721]: I0202 13:24:33.980593 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a6048763-9be8-4530-b02a-78022c20d668","Type":"ContainerStarted","Data":"e2b0754f177aa2fb457d07e7ccb63cf3d2db89f9bc5c8b27f274b7d5e64b1681"} Feb 02 13:24:34 crc kubenswrapper[4721]: I0202 13:24:34.015455 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.901810154 podStartE2EDuration="12.015436621s" podCreationTimestamp="2026-02-02 13:24:22 +0000 UTC" firstStartedPulling="2026-02-02 13:24:26.307430223 +0000 UTC m=+1406.609944612" lastFinishedPulling="2026-02-02 13:24:32.42105668 +0000 UTC m=+1412.723571079" observedRunningTime="2026-02-02 13:24:33.994921264 +0000 UTC m=+1414.297435653" watchObservedRunningTime="2026-02-02 13:24:34.015436621 +0000 UTC m=+1414.317951010" Feb 02 13:24:34 crc kubenswrapper[4721]: I0202 13:24:34.026012 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" podStartSLOduration=6.025994617 podStartE2EDuration="6.025994617s" podCreationTimestamp="2026-02-02 13:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:24:34.022097612 +0000 UTC m=+1414.324612011" watchObservedRunningTime="2026-02-02 13:24:34.025994617 +0000 UTC m=+1414.328508996" Feb 02 13:24:34 crc kubenswrapper[4721]: I0202 13:24:34.603921 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:34 crc kubenswrapper[4721]: I0202 13:24:34.693907 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-85ff748b95-xjf7x" podUID="d2fa85eb-972d-4369-8103-dd4cd3e2b78a" containerName="dnsmasq-dns" probeResult="failure" 
output="dial tcp 10.217.0.206:5353: i/o timeout" Feb 02 13:24:35 crc kubenswrapper[4721]: I0202 13:24:35.049385 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5dce367e-6a22-454b-bd02-4a69a739af22","Type":"ContainerStarted","Data":"f2ec4bfdada8e2932f35d145116d0bf335c4ba0ff8527d320662aec6daa98921"} Feb 02 13:24:35 crc kubenswrapper[4721]: I0202 13:24:35.071327 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a6048763-9be8-4530-b02a-78022c20d668","Type":"ContainerStarted","Data":"014d914a5ce67334b42e317ebe6baa31bca4aa273d11d84a4123f675577064a0"} Feb 02 13:24:35 crc kubenswrapper[4721]: I0202 13:24:35.071374 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a6048763-9be8-4530-b02a-78022c20d668" containerName="cinder-api-log" containerID="cri-o://e2b0754f177aa2fb457d07e7ccb63cf3d2db89f9bc5c8b27f274b7d5e64b1681" gracePeriod=30 Feb 02 13:24:35 crc kubenswrapper[4721]: I0202 13:24:35.071508 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a6048763-9be8-4530-b02a-78022c20d668" containerName="cinder-api" containerID="cri-o://014d914a5ce67334b42e317ebe6baa31bca4aa273d11d84a4123f675577064a0" gracePeriod=30 Feb 02 13:24:35 crc kubenswrapper[4721]: I0202 13:24:35.071789 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 02 13:24:35 crc kubenswrapper[4721]: I0202 13:24:35.086225 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.068730596 podStartE2EDuration="7.086199771s" podCreationTimestamp="2026-02-02 13:24:28 +0000 UTC" firstStartedPulling="2026-02-02 13:24:31.041835724 +0000 UTC m=+1411.344350113" lastFinishedPulling="2026-02-02 13:24:32.059304899 +0000 UTC m=+1412.361819288" observedRunningTime="2026-02-02 13:24:35.075391577 +0000 UTC m=+1415.377905976" watchObservedRunningTime="2026-02-02 13:24:35.086199771 +0000 UTC m=+1415.388714160" Feb 02 13:24:35 crc kubenswrapper[4721]: I0202 13:24:35.109120 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.109097392 podStartE2EDuration="6.109097392s" podCreationTimestamp="2026-02-02 13:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:24:35.099506471 +0000 UTC m=+1415.402020860" watchObservedRunningTime="2026-02-02 13:24:35.109097392 +0000 UTC m=+1415.411611821" Feb 02 13:24:35 crc kubenswrapper[4721]: I0202 13:24:35.122576 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mlbxn" podUID="37372b76-ef54-4a44-9b56-dea754373219" containerName="registry-server" probeResult="failure" output=< Feb 02 13:24:35 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:24:35 crc kubenswrapper[4721]: > Feb 02 13:24:36 crc kubenswrapper[4721]: I0202 13:24:36.118005 4721 generic.go:334] "Generic (PLEG): container finished" podID="a6048763-9be8-4530-b02a-78022c20d668" containerID="e2b0754f177aa2fb457d07e7ccb63cf3d2db89f9bc5c8b27f274b7d5e64b1681" exitCode=143 Feb 02 13:24:36 crc kubenswrapper[4721]: I0202 13:24:36.118218 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"a6048763-9be8-4530-b02a-78022c20d668","Type":"ContainerDied","Data":"e2b0754f177aa2fb457d07e7ccb63cf3d2db89f9bc5c8b27f274b7d5e64b1681"} Feb 02 13:24:37 crc kubenswrapper[4721]: I0202 13:24:37.676838 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-b2blk" podUID="cb085bc7-03fe-45d5-8293-754aa8a47e79" containerName="registry-server" probeResult="failure" output=< Feb 02 13:24:37 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:24:37 crc kubenswrapper[4721]: > Feb 02 13:24:37 crc kubenswrapper[4721]: I0202 13:24:37.984756 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:38 crc kubenswrapper[4721]: I0202 13:24:38.461910 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-784866f846-pjz9x" Feb 02 13:24:38 crc kubenswrapper[4721]: I0202 13:24:38.646994 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57c58bbb98-gpbp2" Feb 02 13:24:38 crc kubenswrapper[4721]: I0202 13:24:38.730431 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-55dd659f54-28qsl"] Feb 02 13:24:38 crc kubenswrapper[4721]: I0202 13:24:38.730731 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-55dd659f54-28qsl" podUID="0f119900-0b52-425a-be0a-0940a4747f89" containerName="barbican-api-log" containerID="cri-o://a307dcbe4bd53270347a396194e4ad6fcdab051a06bc6fbbccd4da1332e2bfc7" gracePeriod=30 Feb 02 13:24:38 crc kubenswrapper[4721]: I0202 13:24:38.730889 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-55dd659f54-28qsl" podUID="0f119900-0b52-425a-be0a-0940a4747f89" containerName="barbican-api" containerID="cri-o://0d5c49970e3de4e3928ae616ad70c665a088d4d4722ff49627ed715494d6560a" gracePeriod=30 Feb 02 13:24:39 crc kubenswrapper[4721]: I0202 13:24:39.180864 4721 generic.go:334] "Generic (PLEG): container finished" podID="0f119900-0b52-425a-be0a-0940a4747f89" containerID="a307dcbe4bd53270347a396194e4ad6fcdab051a06bc6fbbccd4da1332e2bfc7" exitCode=143 Feb 02 13:24:39 crc kubenswrapper[4721]: I0202 13:24:39.181249 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55dd659f54-28qsl" event={"ID":"0f119900-0b52-425a-be0a-0940a4747f89","Type":"ContainerDied","Data":"a307dcbe4bd53270347a396194e4ad6fcdab051a06bc6fbbccd4da1332e2bfc7"} Feb 02 13:24:39 crc kubenswrapper[4721]: I0202 13:24:39.512282 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 02 13:24:39 crc kubenswrapper[4721]: I0202 13:24:39.581235 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:24:39 crc kubenswrapper[4721]: I0202 13:24:39.682449 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mnq9d"] Feb 02 13:24:39 crc kubenswrapper[4721]: I0202 13:24:39.682685 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" podUID="4b323e62-7a54-4935-8e47-2df809ecb2f9" containerName="dnsmasq-dns" containerID="cri-o://38beb737c04426dba147dffdfeaee7f338c6927e4b1cdcf880429b179ca9988b" gracePeriod=10 Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.056706 4721 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.193078 4721 generic.go:334] "Generic (PLEG): container finished" podID="4b323e62-7a54-4935-8e47-2df809ecb2f9" containerID="38beb737c04426dba147dffdfeaee7f338c6927e4b1cdcf880429b179ca9988b" exitCode=0 Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.193417 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" event={"ID":"4b323e62-7a54-4935-8e47-2df809ecb2f9","Type":"ContainerDied","Data":"38beb737c04426dba147dffdfeaee7f338c6927e4b1cdcf880429b179ca9988b"} Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.298648 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.631625 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.727368 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 02 13:24:40 crc kubenswrapper[4721]: E0202 13:24:40.728258 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b323e62-7a54-4935-8e47-2df809ecb2f9" containerName="dnsmasq-dns" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.728304 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b323e62-7a54-4935-8e47-2df809ecb2f9" containerName="dnsmasq-dns" Feb 02 13:24:40 crc kubenswrapper[4721]: E0202 13:24:40.728318 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b323e62-7a54-4935-8e47-2df809ecb2f9" containerName="init" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.728328 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b323e62-7a54-4935-8e47-2df809ecb2f9" containerName="init" Feb 02 13:24:40 crc kubenswrapper[4721]: E0202 13:24:40.728356 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2fa85eb-972d-4369-8103-dd4cd3e2b78a" containerName="dnsmasq-dns" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.728365 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2fa85eb-972d-4369-8103-dd4cd3e2b78a" containerName="dnsmasq-dns" Feb 02 13:24:40 crc kubenswrapper[4721]: E0202 13:24:40.728383 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92544741-12fa-42ac-ba5b-67179ec9443b" containerName="neutron-httpd" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.728390 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="92544741-12fa-42ac-ba5b-67179ec9443b" containerName="neutron-httpd" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.728391 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-dns-swift-storage-0\") pod \"4b323e62-7a54-4935-8e47-2df809ecb2f9\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.728493 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-dns-svc\") pod \"4b323e62-7a54-4935-8e47-2df809ecb2f9\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " Feb 02 13:24:40 crc kubenswrapper[4721]: E0202 13:24:40.728409 4721 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="92544741-12fa-42ac-ba5b-67179ec9443b" containerName="neutron-api" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.728542 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="92544741-12fa-42ac-ba5b-67179ec9443b" containerName="neutron-api" Feb 02 13:24:40 crc kubenswrapper[4721]: E0202 13:24:40.728621 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2fa85eb-972d-4369-8103-dd4cd3e2b78a" containerName="init" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.728631 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2fa85eb-972d-4369-8103-dd4cd3e2b78a" containerName="init" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.729205 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="92544741-12fa-42ac-ba5b-67179ec9443b" containerName="neutron-api" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.729251 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2fa85eb-972d-4369-8103-dd4cd3e2b78a" containerName="dnsmasq-dns" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.729285 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b323e62-7a54-4935-8e47-2df809ecb2f9" containerName="dnsmasq-dns" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.729297 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="92544741-12fa-42ac-ba5b-67179ec9443b" containerName="neutron-httpd" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.730402 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.731592 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-ovsdbserver-sb\") pod \"4b323e62-7a54-4935-8e47-2df809ecb2f9\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.731666 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-config\") pod \"4b323e62-7a54-4935-8e47-2df809ecb2f9\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.731753 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-ovsdbserver-nb\") pod \"4b323e62-7a54-4935-8e47-2df809ecb2f9\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.731830 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcdcl\" (UniqueName: \"kubernetes.io/projected/4b323e62-7a54-4935-8e47-2df809ecb2f9-kube-api-access-gcdcl\") pod \"4b323e62-7a54-4935-8e47-2df809ecb2f9\" (UID: \"4b323e62-7a54-4935-8e47-2df809ecb2f9\") " Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.739772 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.741221 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-hl2f5" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.741413 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 02 
13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.801058 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.813284 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b323e62-7a54-4935-8e47-2df809ecb2f9-kube-api-access-gcdcl" (OuterVolumeSpecName: "kube-api-access-gcdcl") pod "4b323e62-7a54-4935-8e47-2df809ecb2f9" (UID: "4b323e62-7a54-4935-8e47-2df809ecb2f9"). InnerVolumeSpecName "kube-api-access-gcdcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.835291 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7588b1fc-b958-40e3-bec2-abc209c1a802-openstack-config\") pod \"openstackclient\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") " pod="openstack/openstackclient" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.835361 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7588b1fc-b958-40e3-bec2-abc209c1a802-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") " pod="openstack/openstackclient" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.835425 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7588b1fc-b958-40e3-bec2-abc209c1a802-openstack-config-secret\") pod \"openstackclient\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") " pod="openstack/openstackclient" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.835469 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j4zs\" (UniqueName: \"kubernetes.io/projected/7588b1fc-b958-40e3-bec2-abc209c1a802-kube-api-access-9j4zs\") pod \"openstackclient\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") " pod="openstack/openstackclient" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.835877 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4b323e62-7a54-4935-8e47-2df809ecb2f9" (UID: "4b323e62-7a54-4935-8e47-2df809ecb2f9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.836088 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcdcl\" (UniqueName: \"kubernetes.io/projected/4b323e62-7a54-4935-8e47-2df809ecb2f9-kube-api-access-gcdcl\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.841742 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4b323e62-7a54-4935-8e47-2df809ecb2f9" (UID: "4b323e62-7a54-4935-8e47-2df809ecb2f9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.855883 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-config" (OuterVolumeSpecName: "config") pod "4b323e62-7a54-4935-8e47-2df809ecb2f9" (UID: "4b323e62-7a54-4935-8e47-2df809ecb2f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.859421 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4b323e62-7a54-4935-8e47-2df809ecb2f9" (UID: "4b323e62-7a54-4935-8e47-2df809ecb2f9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.893234 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4b323e62-7a54-4935-8e47-2df809ecb2f9" (UID: "4b323e62-7a54-4935-8e47-2df809ecb2f9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.937887 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j4zs\" (UniqueName: \"kubernetes.io/projected/7588b1fc-b958-40e3-bec2-abc209c1a802-kube-api-access-9j4zs\") pod \"openstackclient\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") " pod="openstack/openstackclient" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.938122 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7588b1fc-b958-40e3-bec2-abc209c1a802-openstack-config\") pod \"openstackclient\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") " pod="openstack/openstackclient" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.938163 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7588b1fc-b958-40e3-bec2-abc209c1a802-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") " pod="openstack/openstackclient" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.938226 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7588b1fc-b958-40e3-bec2-abc209c1a802-openstack-config-secret\") pod \"openstackclient\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") " pod="openstack/openstackclient" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.938290 4721 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.938302 4721 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.938312 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.938321 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.938470 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b323e62-7a54-4935-8e47-2df809ecb2f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.939213 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7588b1fc-b958-40e3-bec2-abc209c1a802-openstack-config\") pod \"openstackclient\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") " pod="openstack/openstackclient" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.942227 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7588b1fc-b958-40e3-bec2-abc209c1a802-openstack-config-secret\") pod \"openstackclient\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") " pod="openstack/openstackclient" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.942971 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7588b1fc-b958-40e3-bec2-abc209c1a802-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") " pod="openstack/openstackclient" Feb 02 13:24:40 crc kubenswrapper[4721]: I0202 13:24:40.984041 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j4zs\" (UniqueName: \"kubernetes.io/projected/7588b1fc-b958-40e3-bec2-abc209c1a802-kube-api-access-9j4zs\") pod \"openstackclient\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") " pod="openstack/openstackclient" Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.041099 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.042205 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.074473 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.138322 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.140371 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.190224 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.263656 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5dce367e-6a22-454b-bd02-4a69a739af22" containerName="cinder-scheduler" containerID="cri-o://d6fc3e691cab5185ee5e5e86d01fdac6b5e28431f28e13143ff03ccfe61af6f1" gracePeriod=30 Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.264039 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5dce367e-6a22-454b-bd02-4a69a739af22" containerName="probe" containerID="cri-o://f2ec4bfdada8e2932f35d145116d0bf335c4ba0ff8527d320662aec6daa98921" gracePeriod=30 Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.264160 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.264184 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mnq9d" event={"ID":"4b323e62-7a54-4935-8e47-2df809ecb2f9","Type":"ContainerDied","Data":"2496bbb6bf1b14291e279777fda509f8bd82ead08412e32b9ec7b5156bb292a2"} Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.264920 4721 scope.go:117] "RemoveContainer" containerID="38beb737c04426dba147dffdfeaee7f338c6927e4b1cdcf880429b179ca9988b" Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.329958 4721 scope.go:117] "RemoveContainer" containerID="ece0ac3461ccc45508567010797bf0f995740c80a06e3f1018effd1949e58be1" Feb 02 13:24:41 crc kubenswrapper[4721]: E0202 13:24:41.335442 4721 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 02 13:24:41 crc kubenswrapper[4721]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_7588b1fc-b958-40e3-bec2-abc209c1a802_0(ef60e78374690088d8e634fbbdd05b55f2d30a3b90c095b2db8f72cec0e0c2a5): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ef60e78374690088d8e634fbbdd05b55f2d30a3b90c095b2db8f72cec0e0c2a5" Netns:"/var/run/netns/6b2c382b-0ca6-4b76-a643-497fa6d23186" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=ef60e78374690088d8e634fbbdd05b55f2d30a3b90c095b2db8f72cec0e0c2a5;K8S_POD_UID=7588b1fc-b958-40e3-bec2-abc209c1a802" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/7588b1fc-b958-40e3-bec2-abc209c1a802]: expected pod UID "7588b1fc-b958-40e3-bec2-abc209c1a802" but got "32729b18-a175-4abd-a8cf-392d318b64d8" from Kube API Feb 02 13:24:41 crc kubenswrapper[4721]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 02 13:24:41 crc kubenswrapper[4721]: > Feb 02 13:24:41 crc kubenswrapper[4721]: E0202 13:24:41.335492 4721 kuberuntime_sandbox.go:72] "Failed to 
Feb 02 13:24:41 crc kubenswrapper[4721]: E0202 13:24:41.335492 4721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Feb 02 13:24:41 crc kubenswrapper[4721]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_7588b1fc-b958-40e3-bec2-abc209c1a802_0(ef60e78374690088d8e634fbbdd05b55f2d30a3b90c095b2db8f72cec0e0c2a5): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ef60e78374690088d8e634fbbdd05b55f2d30a3b90c095b2db8f72cec0e0c2a5" Netns:"/var/run/netns/6b2c382b-0ca6-4b76-a643-497fa6d23186" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=ef60e78374690088d8e634fbbdd05b55f2d30a3b90c095b2db8f72cec0e0c2a5;K8S_POD_UID=7588b1fc-b958-40e3-bec2-abc209c1a802" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/7588b1fc-b958-40e3-bec2-abc209c1a802]: expected pod UID "7588b1fc-b958-40e3-bec2-abc209c1a802" but got "32729b18-a175-4abd-a8cf-392d318b64d8" from Kube API
Feb 02 13:24:41 crc kubenswrapper[4721]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Feb 02 13:24:41 crc kubenswrapper[4721]: > pod="openstack/openstackclient"
Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.351749 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32729b18-a175-4abd-a8cf-392d318b64d8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"32729b18-a175-4abd-a8cf-392d318b64d8\") " pod="openstack/openstackclient"
Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.352401 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/32729b18-a175-4abd-a8cf-392d318b64d8-openstack-config\") pod \"openstackclient\" (UID: \"32729b18-a175-4abd-a8cf-392d318b64d8\") " pod="openstack/openstackclient"
Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.352577 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/32729b18-a175-4abd-a8cf-392d318b64d8-openstack-config-secret\") pod \"openstackclient\" (UID: \"32729b18-a175-4abd-a8cf-392d318b64d8\") " pod="openstack/openstackclient"
Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.352917 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kbsn\" (UniqueName: \"kubernetes.io/projected/32729b18-a175-4abd-a8cf-392d318b64d8-kube-api-access-2kbsn\") pod \"openstackclient\" (UID: \"32729b18-a175-4abd-a8cf-392d318b64d8\") " pod="openstack/openstackclient"
Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.423257 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mnq9d"]
Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.440404 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mnq9d"]
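[annotation] The two errors above record a benign race: openstackclient was deleted and immediately recreated under the same name, so by the time the CNI ADD reached Multus, the live API object carried the new UID (32729b18-...) rather than the UID baked into the CNI args (7588b1fc-...). Multus rejects the stale request and kubelet simply retries sandbox creation for the new pod. A minimal Go sketch of that staleness check follows; the lookup function and control flow are illustrative assumptions, not Multus source.

package main

import "fmt"

// podUIDFromAPI stands in for the live API lookup; hard-coded here to the
// recreated pod's UID so the sketch reproduces the log line. (Assumption.)
func podUIDFromAPI(namespace, name string) string {
	return "32729b18-a175-4abd-a8cf-392d318b64d8"
}

// validatePodUID mirrors the check described by the error text: the UID
// carried in the CNI args must still match the pod the API returns.
func validatePodUID(namespace, name, cniArgUID string) error {
	if live := podUIDFromAPI(namespace, name); live != cniArgUID {
		return fmt.Errorf("expected pod UID %q but got %q from Kube API", cniArgUID, live)
	}
	return nil
}

func main() {
	err := validatePodUID("openstack", "openstackclient", "7588b1fc-b958-40e3-bec2-abc209c1a802")
	fmt.Println(err) // the ADD fails; kubelet retries under the new UID
}

Rejecting on UID rather than namespace/name is what makes delete-and-recreate safe: the name is reused, the UID never is.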
Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.454828 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32729b18-a175-4abd-a8cf-392d318b64d8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"32729b18-a175-4abd-a8cf-392d318b64d8\") " pod="openstack/openstackclient"
Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.454899 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/32729b18-a175-4abd-a8cf-392d318b64d8-openstack-config\") pod \"openstackclient\" (UID: \"32729b18-a175-4abd-a8cf-392d318b64d8\") " pod="openstack/openstackclient"
Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.454967 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/32729b18-a175-4abd-a8cf-392d318b64d8-openstack-config-secret\") pod \"openstackclient\" (UID: \"32729b18-a175-4abd-a8cf-392d318b64d8\") " pod="openstack/openstackclient"
Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.455039 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kbsn\" (UniqueName: \"kubernetes.io/projected/32729b18-a175-4abd-a8cf-392d318b64d8-kube-api-access-2kbsn\") pod \"openstackclient\" (UID: \"32729b18-a175-4abd-a8cf-392d318b64d8\") " pod="openstack/openstackclient"
Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.455955 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/32729b18-a175-4abd-a8cf-392d318b64d8-openstack-config\") pod \"openstackclient\" (UID: \"32729b18-a175-4abd-a8cf-392d318b64d8\") " pod="openstack/openstackclient"
Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.464503 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/32729b18-a175-4abd-a8cf-392d318b64d8-openstack-config-secret\") pod \"openstackclient\" (UID: \"32729b18-a175-4abd-a8cf-392d318b64d8\") " pod="openstack/openstackclient"
Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.464514 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32729b18-a175-4abd-a8cf-392d318b64d8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"32729b18-a175-4abd-a8cf-392d318b64d8\") " pod="openstack/openstackclient"
Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.473899 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kbsn\" (UniqueName: \"kubernetes.io/projected/32729b18-a175-4abd-a8cf-392d318b64d8-kube-api-access-2kbsn\") pod \"openstackclient\" (UID: \"32729b18-a175-4abd-a8cf-392d318b64d8\") " pod="openstack/openstackclient"
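[annotation] The reconciler_common.go / operation_generator.go pairs above are kubelet's volume manager working through its usual two-phase sequence: VerifyControllerAttachedVolume records the volume as attached, MountVolume starts, and MountVolume.SetUp reports success. A heavily simplified sketch of that desired-state vs. actual-state loop, with invented types (kubelet's real state objects are much richer):

package main

import "fmt"

// Invented miniature of the volume manager's two states. (Assumption.)
type state int

const (
	attached state = iota // VerifyControllerAttachedVolume has succeeded
	mounted               // MountVolume.SetUp has succeeded
)

// reconcile walks the desired volumes and mounts whatever is attached but
// not yet mounted, emitting the same message pair seen in the log.
func reconcile(desired []string, world map[string]state) {
	for _, vol := range desired {
		if world[vol] == attached {
			fmt.Printf("operationExecutor.MountVolume started for volume %q\n", vol)
			world[vol] = mounted
			fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", vol)
		}
	}
}

func main() {
	world := map[string]state{"combined-ca-bundle": attached, "openstack-config": attached}
	reconcile([]string{"combined-ca-bundle", "openstack-config"}, world)
}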
Feb 02 13:24:41 crc kubenswrapper[4721]: I0202 13:24:41.555496 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.226497 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-55dd659f54-28qsl" podUID="0f119900-0b52-425a-be0a-0940a4747f89" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.207:9311/healthcheck\": read tcp 10.217.0.2:36096->10.217.0.207:9311: read: connection reset by peer"
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.226563 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-55dd659f54-28qsl" podUID="0f119900-0b52-425a-be0a-0940a4747f89" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.207:9311/healthcheck\": read tcp 10.217.0.2:36084->10.217.0.207:9311: read: connection reset by peer"
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.289104 4721 generic.go:334] "Generic (PLEG): container finished" podID="5dce367e-6a22-454b-bd02-4a69a739af22" containerID="f2ec4bfdada8e2932f35d145116d0bf335c4ba0ff8527d320662aec6daa98921" exitCode=0
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.289147 4721 generic.go:334] "Generic (PLEG): container finished" podID="5dce367e-6a22-454b-bd02-4a69a739af22" containerID="d6fc3e691cab5185ee5e5e86d01fdac6b5e28431f28e13143ff03ccfe61af6f1" exitCode=0
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.289200 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5dce367e-6a22-454b-bd02-4a69a739af22","Type":"ContainerDied","Data":"f2ec4bfdada8e2932f35d145116d0bf335c4ba0ff8527d320662aec6daa98921"}
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.289234 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5dce367e-6a22-454b-bd02-4a69a739af22","Type":"ContainerDied","Data":"d6fc3e691cab5185ee5e5e86d01fdac6b5e28431f28e13143ff03ccfe61af6f1"}
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.301736 4721 generic.go:334] "Generic (PLEG): container finished" podID="0f119900-0b52-425a-be0a-0940a4747f89" containerID="0d5c49970e3de4e3928ae616ad70c665a088d4d4722ff49627ed715494d6560a" exitCode=0
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.301843 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.302155 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55dd659f54-28qsl" event={"ID":"0f119900-0b52-425a-be0a-0940a4747f89","Type":"ContainerDied","Data":"0d5c49970e3de4e3928ae616ad70c665a088d4d4722ff49627ed715494d6560a"}
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.308294 4721 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7588b1fc-b958-40e3-bec2-abc209c1a802" podUID="32729b18-a175-4abd-a8cf-392d318b64d8"
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.324348 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.441969 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b323e62-7a54-4935-8e47-2df809ecb2f9" path="/var/lib/kubelet/pods/4b323e62-7a54-4935-8e47-2df809ecb2f9/volumes"
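[annotation] The two "Probe failed" lines above are ordinary HTTP readiness probes racing a container that is shutting down: the GET to the healthcheck endpoint is answered with a TCP RST. A minimal sketch of how such a probe behaves; the endpoint is copied from the log but the timeout is an illustrative value, not read from the pod spec.

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeOnce imitates an HTTP readiness probe: any transport error (such as
// the "connection reset by peer" above) or a bad status counts as failure.
func probeOnce(url string, timeout time.Duration) string {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return fmt.Sprintf("failure: %v", err)
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return "success"
	}
	return fmt.Sprintf("failure: HTTP status %d", resp.StatusCode)
}

func main() {
	// Endpoint taken from the log; it will simply fail when run elsewhere.
	fmt.Println(probeOnce("http://10.217.0.207:9311/healthcheck", time.Second))
}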
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.480725 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.601040 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7588b1fc-b958-40e3-bec2-abc209c1a802-openstack-config-secret\") pod \"7588b1fc-b958-40e3-bec2-abc209c1a802\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") "
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.601284 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7588b1fc-b958-40e3-bec2-abc209c1a802-openstack-config\") pod \"7588b1fc-b958-40e3-bec2-abc209c1a802\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") "
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.601313 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j4zs\" (UniqueName: \"kubernetes.io/projected/7588b1fc-b958-40e3-bec2-abc209c1a802-kube-api-access-9j4zs\") pod \"7588b1fc-b958-40e3-bec2-abc209c1a802\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") "
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.601385 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7588b1fc-b958-40e3-bec2-abc209c1a802-combined-ca-bundle\") pod \"7588b1fc-b958-40e3-bec2-abc209c1a802\" (UID: \"7588b1fc-b958-40e3-bec2-abc209c1a802\") "
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.612586 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7588b1fc-b958-40e3-bec2-abc209c1a802-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "7588b1fc-b958-40e3-bec2-abc209c1a802" (UID: "7588b1fc-b958-40e3-bec2-abc209c1a802"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.623315 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7588b1fc-b958-40e3-bec2-abc209c1a802-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7588b1fc-b958-40e3-bec2-abc209c1a802" (UID: "7588b1fc-b958-40e3-bec2-abc209c1a802"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.648764 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7588b1fc-b958-40e3-bec2-abc209c1a802-kube-api-access-9j4zs" (OuterVolumeSpecName: "kube-api-access-9j4zs") pod "7588b1fc-b958-40e3-bec2-abc209c1a802" (UID: "7588b1fc-b958-40e3-bec2-abc209c1a802"). InnerVolumeSpecName "kube-api-access-9j4zs". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.733029 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7588b1fc-b958-40e3-bec2-abc209c1a802-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.733096 4721 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7588b1fc-b958-40e3-bec2-abc209c1a802-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.733113 4721 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7588b1fc-b958-40e3-bec2-abc209c1a802-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.733126 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j4zs\" (UniqueName: \"kubernetes.io/projected/7588b1fc-b958-40e3-bec2-abc209c1a802-kube-api-access-9j4zs\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.905883 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55dd659f54-28qsl" Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.941963 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7tv9\" (UniqueName: \"kubernetes.io/projected/0f119900-0b52-425a-be0a-0940a4747f89-kube-api-access-k7tv9\") pod \"0f119900-0b52-425a-be0a-0940a4747f89\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.942048 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-config-data-custom\") pod \"0f119900-0b52-425a-be0a-0940a4747f89\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.942130 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-config-data\") pod \"0f119900-0b52-425a-be0a-0940a4747f89\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.942214 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f119900-0b52-425a-be0a-0940a4747f89-logs\") pod \"0f119900-0b52-425a-be0a-0940a4747f89\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.942297 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-combined-ca-bundle\") pod \"0f119900-0b52-425a-be0a-0940a4747f89\" (UID: \"0f119900-0b52-425a-be0a-0940a4747f89\") " Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.945408 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f119900-0b52-425a-be0a-0940a4747f89-logs" (OuterVolumeSpecName: "logs") pod "0f119900-0b52-425a-be0a-0940a4747f89" (UID: "0f119900-0b52-425a-be0a-0940a4747f89"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.959886 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f119900-0b52-425a-be0a-0940a4747f89-kube-api-access-k7tv9" (OuterVolumeSpecName: "kube-api-access-k7tv9") pod "0f119900-0b52-425a-be0a-0940a4747f89" (UID: "0f119900-0b52-425a-be0a-0940a4747f89"). InnerVolumeSpecName "kube-api-access-k7tv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.960635 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0f119900-0b52-425a-be0a-0940a4747f89" (UID: "0f119900-0b52-425a-be0a-0940a4747f89"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:42 crc kubenswrapper[4721]: I0202 13:24:42.996364 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.021317 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f119900-0b52-425a-be0a-0940a4747f89" (UID: "0f119900-0b52-425a-be0a-0940a4747f89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.044711 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8v7w\" (UniqueName: \"kubernetes.io/projected/5dce367e-6a22-454b-bd02-4a69a739af22-kube-api-access-r8v7w\") pod \"5dce367e-6a22-454b-bd02-4a69a739af22\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.045129 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-config-data\") pod \"5dce367e-6a22-454b-bd02-4a69a739af22\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.045366 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-scripts\") pod \"5dce367e-6a22-454b-bd02-4a69a739af22\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.045535 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-config-data-custom\") pod \"5dce367e-6a22-454b-bd02-4a69a739af22\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.045651 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5dce367e-6a22-454b-bd02-4a69a739af22-etc-machine-id\") pod \"5dce367e-6a22-454b-bd02-4a69a739af22\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.045749 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-combined-ca-bundle\") pod \"5dce367e-6a22-454b-bd02-4a69a739af22\" (UID: \"5dce367e-6a22-454b-bd02-4a69a739af22\") " Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.046580 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7tv9\" (UniqueName: \"kubernetes.io/projected/0f119900-0b52-425a-be0a-0940a4747f89-kube-api-access-k7tv9\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.047142 4721 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.047238 4721 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f119900-0b52-425a-be0a-0940a4747f89-logs\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.047330 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.048029 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dce367e-6a22-454b-bd02-4a69a739af22-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5dce367e-6a22-454b-bd02-4a69a739af22" (UID: "5dce367e-6a22-454b-bd02-4a69a739af22"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.054134 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5dce367e-6a22-454b-bd02-4a69a739af22" (UID: "5dce367e-6a22-454b-bd02-4a69a739af22"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.067619 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-scripts" (OuterVolumeSpecName: "scripts") pod "5dce367e-6a22-454b-bd02-4a69a739af22" (UID: "5dce367e-6a22-454b-bd02-4a69a739af22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.076712 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dce367e-6a22-454b-bd02-4a69a739af22-kube-api-access-r8v7w" (OuterVolumeSpecName: "kube-api-access-r8v7w") pod "5dce367e-6a22-454b-bd02-4a69a739af22" (UID: "5dce367e-6a22-454b-bd02-4a69a739af22"). InnerVolumeSpecName "kube-api-access-r8v7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.106228 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-config-data" (OuterVolumeSpecName: "config-data") pod "0f119900-0b52-425a-be0a-0940a4747f89" (UID: "0f119900-0b52-425a-be0a-0940a4747f89"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.154605 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8v7w\" (UniqueName: \"kubernetes.io/projected/5dce367e-6a22-454b-bd02-4a69a739af22-kube-api-access-r8v7w\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.154636 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f119900-0b52-425a-be0a-0940a4747f89-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.154646 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.154655 4721 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.154663 4721 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5dce367e-6a22-454b-bd02-4a69a739af22-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.211242 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dce367e-6a22-454b-bd02-4a69a739af22" (UID: "5dce367e-6a22-454b-bd02-4a69a739af22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.245543 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-config-data" (OuterVolumeSpecName: "config-data") pod "5dce367e-6a22-454b-bd02-4a69a739af22" (UID: "5dce367e-6a22-454b-bd02-4a69a739af22"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.257519 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.257559 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dce367e-6a22-454b-bd02-4a69a739af22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.314999 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"32729b18-a175-4abd-a8cf-392d318b64d8","Type":"ContainerStarted","Data":"095a1845a869c5d12625722bd7e1bd061627e2ac7f6820de74ab59cf641e82b6"} Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.320192 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5dce367e-6a22-454b-bd02-4a69a739af22","Type":"ContainerDied","Data":"308b644f1394a2cc3225f510fa06697b0a8ae1e9f8e2aa6c15bae4c005148f01"} Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.320235 4721 scope.go:117] "RemoveContainer" containerID="f2ec4bfdada8e2932f35d145116d0bf335c4ba0ff8527d320662aec6daa98921" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.320231 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.328636 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.329222 4721 util.go:48] "No ready sandbox for pod can be found. 
Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.329222 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55dd659f54-28qsl"
Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.329391 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55dd659f54-28qsl" event={"ID":"0f119900-0b52-425a-be0a-0940a4747f89","Type":"ContainerDied","Data":"5366c1b45533360e87afcbc53d2e0646f4fb35a11a7eba0ff655b53f1aa8c34c"}
Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.332435 4721 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="7588b1fc-b958-40e3-bec2-abc209c1a802" podUID="32729b18-a175-4abd-a8cf-392d318b64d8"
Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.386151 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.392022 4721 scope.go:117] "RemoveContainer" containerID="d6fc3e691cab5185ee5e5e86d01fdac6b5e28431f28e13143ff03ccfe61af6f1"
Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.401541 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.440908 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-55dd659f54-28qsl"]
Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.461406 4721 scope.go:117] "RemoveContainer" containerID="0d5c49970e3de4e3928ae616ad70c665a088d4d4722ff49627ed715494d6560a"
Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.461540 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 02 13:24:43 crc kubenswrapper[4721]: E0202 13:24:43.462045 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f119900-0b52-425a-be0a-0940a4747f89" containerName="barbican-api-log"
Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.466085 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f119900-0b52-425a-be0a-0940a4747f89" containerName="barbican-api-log"
Feb 02 13:24:43 crc kubenswrapper[4721]: E0202 13:24:43.466139 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f119900-0b52-425a-be0a-0940a4747f89" containerName="barbican-api"
Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.466148 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f119900-0b52-425a-be0a-0940a4747f89" containerName="barbican-api"
Feb 02 13:24:43 crc kubenswrapper[4721]: E0202 13:24:43.466161 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dce367e-6a22-454b-bd02-4a69a739af22" containerName="cinder-scheduler"
Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.466170 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dce367e-6a22-454b-bd02-4a69a739af22" containerName="cinder-scheduler"
Feb 02 13:24:43 crc kubenswrapper[4721]: E0202 13:24:43.466189 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dce367e-6a22-454b-bd02-4a69a739af22" containerName="probe"
Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.466195 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dce367e-6a22-454b-bd02-4a69a739af22" containerName="probe"
Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.466542 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dce367e-6a22-454b-bd02-4a69a739af22" containerName="cinder-scheduler"
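[annotation] The cpu_manager / state_mem / memory_manager burst above is stale-state cleanup: before admitting the replacement cinder-scheduler-0 pod, the resource managers drop CPUSet and memory assignments still recorded for the old pod UIDs. A sketch of that bookkeeping; the map layout is an assumption, not kubelet's actual state type.

package main

import "fmt"

// removeStaleState drops per-container resource assignments for pods that
// no longer exist, printing the same style of message as state_mem.go.
func removeStaleState(assignments map[string]map[string]string, live map[string]bool) {
	for podUID, containers := range assignments {
		if live[podUID] {
			continue
		}
		for name := range containers {
			fmt.Printf("Deleted CPUSet assignment podUID=%q containerName=%q\n", podUID, name)
		}
		delete(assignments, podUID) // deleting during range is safe in Go
	}
}

func main() {
	assignments := map[string]map[string]string{
		"5dce367e-6a22-454b-bd02-4a69a739af22": {"cinder-scheduler": "0-3", "probe": "0-3"},
		"0f119900-0b52-425a-be0a-0940a4747f89": {"barbican-api": "0-3", "barbican-api-log": "0-3"},
	}
	removeStaleState(assignments, map[string]bool{}) // neither pod is live any more
	fmt.Println("entries left:", len(assignments))   // 0
}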
podUID="0f119900-0b52-425a-be0a-0940a4747f89" containerName="barbican-api" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.466573 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dce367e-6a22-454b-bd02-4a69a739af22" containerName="probe" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.466592 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f119900-0b52-425a-be0a-0940a4747f89" containerName="barbican-api-log" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.467847 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.471882 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.488672 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-55dd659f54-28qsl"] Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.515618 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.566847 4721 scope.go:117] "RemoveContainer" containerID="a307dcbe4bd53270347a396194e4ad6fcdab051a06bc6fbbccd4da1332e2bfc7" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.573802 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.573951 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.574038 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdvzs\" (UniqueName: \"kubernetes.io/projected/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-kube-api-access-sdvzs\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.574207 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-scripts\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.574286 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.574348 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.676445 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.676532 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.676575 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdvzs\" (UniqueName: \"kubernetes.io/projected/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-kube-api-access-sdvzs\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.676634 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.676653 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-scripts\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.676824 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.676866 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-config-data\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.680367 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-scripts\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.681964 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-config-data\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.682401 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.682560 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.697567 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdvzs\" (UniqueName: \"kubernetes.io/projected/7d63f1df-bbdc-42ee-a234-2d691a3ce7ba-kube-api-access-sdvzs\") pod \"cinder-scheduler-0\" (UID: \"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba\") " pod="openstack/cinder-scheduler-0" Feb 02 13:24:43 crc kubenswrapper[4721]: I0202 13:24:43.829896 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 13:24:44 crc kubenswrapper[4721]: I0202 13:24:44.398909 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 13:24:44 crc kubenswrapper[4721]: I0202 13:24:44.434461 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f119900-0b52-425a-be0a-0940a4747f89" path="/var/lib/kubelet/pods/0f119900-0b52-425a-be0a-0940a4747f89/volumes" Feb 02 13:24:44 crc kubenswrapper[4721]: I0202 13:24:44.435420 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dce367e-6a22-454b-bd02-4a69a739af22" path="/var/lib/kubelet/pods/5dce367e-6a22-454b-bd02-4a69a739af22/volumes" Feb 02 13:24:44 crc kubenswrapper[4721]: I0202 13:24:44.436234 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7588b1fc-b958-40e3-bec2-abc209c1a802" path="/var/lib/kubelet/pods/7588b1fc-b958-40e3-bec2-abc209c1a802/volumes" Feb 02 13:24:44 crc kubenswrapper[4721]: I0202 13:24:44.697222 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 02 13:24:44 crc kubenswrapper[4721]: I0202 13:24:44.763231 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:24:44 crc kubenswrapper[4721]: I0202 13:24:44.763284 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:24:45 crc kubenswrapper[4721]: I0202 13:24:45.097509 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mlbxn" podUID="37372b76-ef54-4a44-9b56-dea754373219" containerName="registry-server" probeResult="failure" output=< Feb 02 13:24:45 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:24:45 crc kubenswrapper[4721]: > Feb 02 13:24:45 crc kubenswrapper[4721]: I0202 13:24:45.113251 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6ccdcdf5fb-gncnr" Feb 02 13:24:45 crc 
Feb 02 13:24:45 crc kubenswrapper[4721]: I0202 13:24:45.176254 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6ccdcdf5fb-gncnr"
Feb 02 13:24:45 crc kubenswrapper[4721]: I0202 13:24:45.295243 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-75b75c495b-kpsxz"]
Feb 02 13:24:45 crc kubenswrapper[4721]: I0202 13:24:45.295733 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-75b75c495b-kpsxz" podUID="873ec78b-5777-4560-a744-c4789b43d966" containerName="placement-log" containerID="cri-o://be87e0c26b43f57fe8bd6e999c716fddff3bcf7e4eec9b29a39b86ceafba0594" gracePeriod=30
Feb 02 13:24:45 crc kubenswrapper[4721]: I0202 13:24:45.296306 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-75b75c495b-kpsxz" podUID="873ec78b-5777-4560-a744-c4789b43d966" containerName="placement-api" containerID="cri-o://9340d886174492f6ae0a15f3f3e7b662045af6f27123113fd5b1ff49e71bab73" gracePeriod=30
Feb 02 13:24:45 crc kubenswrapper[4721]: I0202 13:24:45.430479 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba","Type":"ContainerStarted","Data":"b970b10b4d8b2db37f91db7ce227dfec2e7533ccd3d9f4a2de977aba52503140"}
Feb 02 13:24:45 crc kubenswrapper[4721]: I0202 13:24:45.430514 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba","Type":"ContainerStarted","Data":"49d0dd9de3db0f2855ebbfa61b9e326e50f23cf02562ca89635931fe6f270729"}
Feb 02 13:24:46 crc kubenswrapper[4721]: I0202 13:24:46.451976 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7d63f1df-bbdc-42ee-a234-2d691a3ce7ba","Type":"ContainerStarted","Data":"c72bbd2a66e905e47f9cac569f2d5aea00e3740eddfea99d3ab627e9a5a914cf"}
Feb 02 13:24:46 crc kubenswrapper[4721]: I0202 13:24:46.460033 4721 generic.go:334] "Generic (PLEG): container finished" podID="873ec78b-5777-4560-a744-c4789b43d966" containerID="be87e0c26b43f57fe8bd6e999c716fddff3bcf7e4eec9b29a39b86ceafba0594" exitCode=143
Feb 02 13:24:46 crc kubenswrapper[4721]: I0202 13:24:46.460093 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75b75c495b-kpsxz" event={"ID":"873ec78b-5777-4560-a744-c4789b43d966","Type":"ContainerDied","Data":"be87e0c26b43f57fe8bd6e999c716fddff3bcf7e4eec9b29a39b86ceafba0594"}
Feb 02 13:24:46 crc kubenswrapper[4721]: I0202 13:24:46.473790 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.473766332 podStartE2EDuration="3.473766332s" podCreationTimestamp="2026-02-02 13:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:24:46.469920828 +0000 UTC m=+1426.772435217" watchObservedRunningTime="2026-02-02 13:24:46.473766332 +0000 UTC m=+1426.776280721"
Feb 02 13:24:46 crc kubenswrapper[4721]: I0202 13:24:46.663376 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b2blk"
Feb 02 13:24:46 crc kubenswrapper[4721]: I0202 13:24:46.733688 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b2blk"
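[annotation] Two numbers above are worth decoding. placement-log exits with code 143, the conventional 128+signal encoding for a SIGTERM (15) delivered at the start of the 30s grace period, and podStartSLOduration for cinder-scheduler-0 is simply watchObservedRunningTime minus podCreationTimestamp. Both checks in Go:

package main

import (
	"fmt"
	"time"
)

func main() {
	// exitCode=143 is the shell convention 128+signal for SIGTERM (15).
	const sigterm = 15
	fmt.Println("exit code for SIGTERM:", 128+sigterm) // 143

	// podStartSLOduration = watchObservedRunningTime - podCreationTimestamp,
	// using the timestamps printed by pod_startup_latency_tracker.go above.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2026-02-02 13:24:43 +0000 UTC")
	observed, _ := time.Parse(layout, "2026-02-02 13:24:46.473766332 +0000 UTC")
	fmt.Println("podStartSLOduration:", observed.Sub(created)) // 3.473766332s
}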
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b2blk"] Feb 02 13:24:47 crc kubenswrapper[4721]: I0202 13:24:47.831698 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-9b87bd57c-2glsn"] Feb 02 13:24:47 crc kubenswrapper[4721]: I0202 13:24:47.833935 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-9b87bd57c-2glsn" Feb 02 13:24:47 crc kubenswrapper[4721]: I0202 13:24:47.841572 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 02 13:24:47 crc kubenswrapper[4721]: I0202 13:24:47.841846 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 02 13:24:47 crc kubenswrapper[4721]: I0202 13:24:47.842136 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 02 13:24:47 crc kubenswrapper[4721]: I0202 13:24:47.844149 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9b87bd57c-2glsn"] Feb 02 13:24:47 crc kubenswrapper[4721]: I0202 13:24:47.904214 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-run-httpd\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn" Feb 02 13:24:47 crc kubenswrapper[4721]: I0202 13:24:47.904262 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-log-httpd\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn" Feb 02 13:24:47 crc kubenswrapper[4721]: I0202 13:24:47.904335 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-combined-ca-bundle\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn" Feb 02 13:24:47 crc kubenswrapper[4721]: I0202 13:24:47.904412 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-public-tls-certs\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn" Feb 02 13:24:47 crc kubenswrapper[4721]: I0202 13:24:47.904460 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-config-data\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn" Feb 02 13:24:47 crc kubenswrapper[4721]: I0202 13:24:47.904478 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-etc-swift\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn" Feb 02 13:24:47 crc kubenswrapper[4721]: I0202 
Feb 02 13:24:47 crc kubenswrapper[4721]: I0202 13:24:47.904534 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-internal-tls-certs\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:47 crc kubenswrapper[4721]: I0202 13:24:47.904595 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t252c\" (UniqueName: \"kubernetes.io/projected/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-kube-api-access-t252c\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.007105 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-combined-ca-bundle\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.007253 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-public-tls-certs\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.008210 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-config-data\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.008243 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-etc-swift\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.008331 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-internal-tls-certs\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.008424 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t252c\" (UniqueName: \"kubernetes.io/projected/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-kube-api-access-t252c\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.008511 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-run-httpd\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.008548 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-log-httpd\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.009017 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-log-httpd\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.010177 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-run-httpd\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.013695 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-public-tls-certs\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.014402 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-config-data\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.015225 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-etc-swift\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.016591 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-combined-ca-bundle\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.016739 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-internal-tls-certs\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.031596 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t252c\" (UniqueName: \"kubernetes.io/projected/c04183e6-a1f0-4d8c-aa00-8dd660336a3b-kube-api-access-t252c\") pod \"swift-proxy-9b87bd57c-2glsn\" (UID: \"c04183e6-a1f0-4d8c-aa00-8dd660336a3b\") " pod="openstack/swift-proxy-9b87bd57c-2glsn"
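[annotation] etc-swift and the kube-api-access-* volumes mounted above are projected volumes: several sources rendered into a single directory. For the standard ServiceAccount token projection the resulting layout looks like the sketch below; this is general Kubernetes behaviour, not read from this cluster's pod specs.

package main

import "fmt"

// Standard contents of a kube-api-access-* projected volume: three
// projections rendered into the ServiceAccount mount path.
func main() {
	sources := []struct{ file, from string }{
		{"token", "bound ServiceAccount token (auto-rotated)"},
		{"ca.crt", "cluster CA bundle projected from a ConfigMap"},
		{"namespace", "downward API field metadata.namespace"},
	}
	for _, s := range sources {
		fmt.Printf("/var/run/secrets/kubernetes.io/serviceaccount/%s <- %s\n", s.file, s.from)
	}
}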
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.182397 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-9b87bd57c-2glsn"
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.496913 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b2blk" podUID="cb085bc7-03fe-45d5-8293-754aa8a47e79" containerName="registry-server" containerID="cri-o://fe87ede3ed73966fcb2f3ff06b67149c74278e5cdd6e6a2388c62a80901815de" gracePeriod=2
Feb 02 13:24:48 crc kubenswrapper[4721]: I0202 13:24:48.830263 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.421977 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9b87bd57c-2glsn"]
Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.446690 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2blk"
Feb 02 13:24:49 crc kubenswrapper[4721]: W0202 13:24:49.453151 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc04183e6_a1f0_4d8c_aa00_8dd660336a3b.slice/crio-5dc8ef2af1518df2d6f4893a51d2dd25291bfd1ab7d8a637f7ca69a704b0a1ea WatchSource:0}: Error finding container 5dc8ef2af1518df2d6f4893a51d2dd25291bfd1ab7d8a637f7ca69a704b0a1ea: Status 404 returned error can't find the container with id 5dc8ef2af1518df2d6f4893a51d2dd25291bfd1ab7d8a637f7ca69a704b0a1ea
Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.554339 4721 generic.go:334] "Generic (PLEG): container finished" podID="873ec78b-5777-4560-a744-c4789b43d966" containerID="9340d886174492f6ae0a15f3f3e7b662045af6f27123113fd5b1ff49e71bab73" exitCode=0
Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.554493 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75b75c495b-kpsxz" event={"ID":"873ec78b-5777-4560-a744-c4789b43d966","Type":"ContainerDied","Data":"9340d886174492f6ae0a15f3f3e7b662045af6f27123113fd5b1ff49e71bab73"}
Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.570943 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb085bc7-03fe-45d5-8293-754aa8a47e79-catalog-content\") pod \"cb085bc7-03fe-45d5-8293-754aa8a47e79\" (UID: \"cb085bc7-03fe-45d5-8293-754aa8a47e79\") "
Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.571138 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqx6q\" (UniqueName: \"kubernetes.io/projected/cb085bc7-03fe-45d5-8293-754aa8a47e79-kube-api-access-dqx6q\") pod \"cb085bc7-03fe-45d5-8293-754aa8a47e79\" (UID: \"cb085bc7-03fe-45d5-8293-754aa8a47e79\") "
Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.571218 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb085bc7-03fe-45d5-8293-754aa8a47e79-utilities\") pod \"cb085bc7-03fe-45d5-8293-754aa8a47e79\" (UID: \"cb085bc7-03fe-45d5-8293-754aa8a47e79\") "
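[annotation] The manager.go:1169 warning above is a benign cAdvisor race: the cgroup watch sees the new crio-<id> slice before the runtime has registered the container, the lookup returns 404, and the event is dropped; a later event re-discovers the container (swift-proxy indeed reports ContainerStarted just below). A sketch of that handle-and-drop pattern, with invented types:

package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("Status 404 returned error can't find the container")

// handleWatchEvent drops events whose container the runtime cannot resolve
// yet; a later event for the same id succeeds once it is registered.
func handleWatchEvent(id string, lookup func(string) error) {
	if err := lookup(id); err != nil {
		fmt.Printf("W Failed to process watch event for %s: %v\n", id, err)
		return // harmless: the container is picked up on a later event
	}
	fmt.Println("container registered:", id)
}

func main() {
	handleWatchEvent("5dc8ef2af1518df2d6f4893a51d2dd25291bfd1ab7d8a637f7ca69a704b0a1ea",
		func(string) error { return errNotFound })
}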
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.588754 4721 generic.go:334] "Generic (PLEG): container finished" podID="cb085bc7-03fe-45d5-8293-754aa8a47e79" containerID="fe87ede3ed73966fcb2f3ff06b67149c74278e5cdd6e6a2388c62a80901815de" exitCode=0 Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.588821 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2blk" event={"ID":"cb085bc7-03fe-45d5-8293-754aa8a47e79","Type":"ContainerDied","Data":"fe87ede3ed73966fcb2f3ff06b67149c74278e5cdd6e6a2388c62a80901815de"} Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.588850 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2blk" event={"ID":"cb085bc7-03fe-45d5-8293-754aa8a47e79","Type":"ContainerDied","Data":"ca306f9ea887ebe52384a1958a49ecea30993901251ae5049666a5cc6680fe83"} Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.588866 4721 scope.go:117] "RemoveContainer" containerID="fe87ede3ed73966fcb2f3ff06b67149c74278e5cdd6e6a2388c62a80901815de" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.589039 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2blk" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.592142 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb085bc7-03fe-45d5-8293-754aa8a47e79-kube-api-access-dqx6q" (OuterVolumeSpecName: "kube-api-access-dqx6q") pod "cb085bc7-03fe-45d5-8293-754aa8a47e79" (UID: "cb085bc7-03fe-45d5-8293-754aa8a47e79"). InnerVolumeSpecName "kube-api-access-dqx6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.598794 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9b87bd57c-2glsn" event={"ID":"c04183e6-a1f0-4d8c-aa00-8dd660336a3b","Type":"ContainerStarted","Data":"5dc8ef2af1518df2d6f4893a51d2dd25291bfd1ab7d8a637f7ca69a704b0a1ea"} Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.623997 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.637982 4721 scope.go:117] "RemoveContainer" containerID="1657f0f86f0baa3fd5b4910f2d2a3f7f01376b4573cd7a450f9ac3d3c0377f0f" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.669608 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb085bc7-03fe-45d5-8293-754aa8a47e79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb085bc7-03fe-45d5-8293-754aa8a47e79" (UID: "cb085bc7-03fe-45d5-8293-754aa8a47e79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.679541 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-scripts\") pod \"873ec78b-5777-4560-a744-c4789b43d966\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.681951 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-config-data\") pod \"873ec78b-5777-4560-a744-c4789b43d966\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.682031 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-internal-tls-certs\") pod \"873ec78b-5777-4560-a744-c4789b43d966\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.682207 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-public-tls-certs\") pod \"873ec78b-5777-4560-a744-c4789b43d966\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.683584 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-combined-ca-bundle\") pod \"873ec78b-5777-4560-a744-c4789b43d966\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.683645 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngbhh\" (UniqueName: \"kubernetes.io/projected/873ec78b-5777-4560-a744-c4789b43d966-kube-api-access-ngbhh\") pod \"873ec78b-5777-4560-a744-c4789b43d966\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.683742 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/873ec78b-5777-4560-a744-c4789b43d966-logs\") pod \"873ec78b-5777-4560-a744-c4789b43d966\" (UID: \"873ec78b-5777-4560-a744-c4789b43d966\") " Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.705833 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqx6q\" (UniqueName: \"kubernetes.io/projected/cb085bc7-03fe-45d5-8293-754aa8a47e79-kube-api-access-dqx6q\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.705871 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb085bc7-03fe-45d5-8293-754aa8a47e79-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.705886 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb085bc7-03fe-45d5-8293-754aa8a47e79-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.705943 4721 scope.go:117] "RemoveContainer" containerID="5e7d094ebf15b03eae8cab7968f99b33fb74f6a236c1143143fd5cef22da4396" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 
13:24:49.706201 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/873ec78b-5777-4560-a744-c4789b43d966-logs" (OuterVolumeSpecName: "logs") pod "873ec78b-5777-4560-a744-c4789b43d966" (UID: "873ec78b-5777-4560-a744-c4789b43d966"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.717267 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-scripts" (OuterVolumeSpecName: "scripts") pod "873ec78b-5777-4560-a744-c4789b43d966" (UID: "873ec78b-5777-4560-a744-c4789b43d966"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.717617 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/873ec78b-5777-4560-a744-c4789b43d966-kube-api-access-ngbhh" (OuterVolumeSpecName: "kube-api-access-ngbhh") pod "873ec78b-5777-4560-a744-c4789b43d966" (UID: "873ec78b-5777-4560-a744-c4789b43d966"). InnerVolumeSpecName "kube-api-access-ngbhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.786407 4721 scope.go:117] "RemoveContainer" containerID="fe87ede3ed73966fcb2f3ff06b67149c74278e5cdd6e6a2388c62a80901815de" Feb 02 13:24:49 crc kubenswrapper[4721]: E0202 13:24:49.787279 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe87ede3ed73966fcb2f3ff06b67149c74278e5cdd6e6a2388c62a80901815de\": container with ID starting with fe87ede3ed73966fcb2f3ff06b67149c74278e5cdd6e6a2388c62a80901815de not found: ID does not exist" containerID="fe87ede3ed73966fcb2f3ff06b67149c74278e5cdd6e6a2388c62a80901815de" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.787315 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe87ede3ed73966fcb2f3ff06b67149c74278e5cdd6e6a2388c62a80901815de"} err="failed to get container status \"fe87ede3ed73966fcb2f3ff06b67149c74278e5cdd6e6a2388c62a80901815de\": rpc error: code = NotFound desc = could not find container \"fe87ede3ed73966fcb2f3ff06b67149c74278e5cdd6e6a2388c62a80901815de\": container with ID starting with fe87ede3ed73966fcb2f3ff06b67149c74278e5cdd6e6a2388c62a80901815de not found: ID does not exist" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.787338 4721 scope.go:117] "RemoveContainer" containerID="1657f0f86f0baa3fd5b4910f2d2a3f7f01376b4573cd7a450f9ac3d3c0377f0f" Feb 02 13:24:49 crc kubenswrapper[4721]: E0202 13:24:49.787762 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1657f0f86f0baa3fd5b4910f2d2a3f7f01376b4573cd7a450f9ac3d3c0377f0f\": container with ID starting with 1657f0f86f0baa3fd5b4910f2d2a3f7f01376b4573cd7a450f9ac3d3c0377f0f not found: ID does not exist" containerID="1657f0f86f0baa3fd5b4910f2d2a3f7f01376b4573cd7a450f9ac3d3c0377f0f" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.787780 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1657f0f86f0baa3fd5b4910f2d2a3f7f01376b4573cd7a450f9ac3d3c0377f0f"} err="failed to get container status \"1657f0f86f0baa3fd5b4910f2d2a3f7f01376b4573cd7a450f9ac3d3c0377f0f\": rpc error: code = NotFound desc = could not find container 
\"1657f0f86f0baa3fd5b4910f2d2a3f7f01376b4573cd7a450f9ac3d3c0377f0f\": container with ID starting with 1657f0f86f0baa3fd5b4910f2d2a3f7f01376b4573cd7a450f9ac3d3c0377f0f not found: ID does not exist" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.787794 4721 scope.go:117] "RemoveContainer" containerID="5e7d094ebf15b03eae8cab7968f99b33fb74f6a236c1143143fd5cef22da4396" Feb 02 13:24:49 crc kubenswrapper[4721]: E0202 13:24:49.788021 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e7d094ebf15b03eae8cab7968f99b33fb74f6a236c1143143fd5cef22da4396\": container with ID starting with 5e7d094ebf15b03eae8cab7968f99b33fb74f6a236c1143143fd5cef22da4396 not found: ID does not exist" containerID="5e7d094ebf15b03eae8cab7968f99b33fb74f6a236c1143143fd5cef22da4396" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.788039 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7d094ebf15b03eae8cab7968f99b33fb74f6a236c1143143fd5cef22da4396"} err="failed to get container status \"5e7d094ebf15b03eae8cab7968f99b33fb74f6a236c1143143fd5cef22da4396\": rpc error: code = NotFound desc = could not find container \"5e7d094ebf15b03eae8cab7968f99b33fb74f6a236c1143143fd5cef22da4396\": container with ID starting with 5e7d094ebf15b03eae8cab7968f99b33fb74f6a236c1143143fd5cef22da4396 not found: ID does not exist" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.818971 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngbhh\" (UniqueName: \"kubernetes.io/projected/873ec78b-5777-4560-a744-c4789b43d966-kube-api-access-ngbhh\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.819016 4721 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/873ec78b-5777-4560-a744-c4789b43d966-logs\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.819034 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.827840 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-config-data" (OuterVolumeSpecName: "config-data") pod "873ec78b-5777-4560-a744-c4789b43d966" (UID: "873ec78b-5777-4560-a744-c4789b43d966"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.852982 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "873ec78b-5777-4560-a744-c4789b43d966" (UID: "873ec78b-5777-4560-a744-c4789b43d966"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.917610 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "873ec78b-5777-4560-a744-c4789b43d966" (UID: "873ec78b-5777-4560-a744-c4789b43d966"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.928336 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.932713 4721 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.932748 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.969545 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "873ec78b-5777-4560-a744-c4789b43d966" (UID: "873ec78b-5777-4560-a744-c4789b43d966"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.977351 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b2blk"] Feb 02 13:24:49 crc kubenswrapper[4721]: I0202 13:24:49.992528 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b2blk"] Feb 02 13:24:50 crc kubenswrapper[4721]: I0202 13:24:50.035058 4721 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/873ec78b-5777-4560-a744-c4789b43d966-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:24:50 crc kubenswrapper[4721]: I0202 13:24:50.432972 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb085bc7-03fe-45d5-8293-754aa8a47e79" path="/var/lib/kubelet/pods/cb085bc7-03fe-45d5-8293-754aa8a47e79/volumes" Feb 02 13:24:50 crc kubenswrapper[4721]: I0202 13:24:50.631306 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75b75c495b-kpsxz" event={"ID":"873ec78b-5777-4560-a744-c4789b43d966","Type":"ContainerDied","Data":"bef5ef7eca11516e2b9cce9579ac419ba493f62c89b699687290738232336cce"} Feb 02 13:24:50 crc kubenswrapper[4721]: I0202 13:24:50.631363 4721 scope.go:117] "RemoveContainer" containerID="9340d886174492f6ae0a15f3f3e7b662045af6f27123113fd5b1ff49e71bab73" Feb 02 13:24:50 crc kubenswrapper[4721]: I0202 13:24:50.631481 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-75b75c495b-kpsxz" Feb 02 13:24:50 crc kubenswrapper[4721]: I0202 13:24:50.653179 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9b87bd57c-2glsn" event={"ID":"c04183e6-a1f0-4d8c-aa00-8dd660336a3b","Type":"ContainerStarted","Data":"e5cda999484cdb96ee93ed2a22b964372c74581db1efaf3b1e3eb994a5a492ce"} Feb 02 13:24:50 crc kubenswrapper[4721]: I0202 13:24:50.653229 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9b87bd57c-2glsn" event={"ID":"c04183e6-a1f0-4d8c-aa00-8dd660336a3b","Type":"ContainerStarted","Data":"860409a2d3a132da8f27bfee83b6ab2893aea6a337436216e9ca0ca552f22f90"} Feb 02 13:24:50 crc kubenswrapper[4721]: I0202 13:24:50.653680 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-9b87bd57c-2glsn" Feb 02 13:24:50 crc kubenswrapper[4721]: I0202 13:24:50.653726 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-9b87bd57c-2glsn" Feb 02 13:24:50 crc kubenswrapper[4721]: I0202 13:24:50.662284 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-75b75c495b-kpsxz"] Feb 02 13:24:50 crc kubenswrapper[4721]: I0202 13:24:50.696518 4721 scope.go:117] "RemoveContainer" containerID="be87e0c26b43f57fe8bd6e999c716fddff3bcf7e4eec9b29a39b86ceafba0594" Feb 02 13:24:50 crc kubenswrapper[4721]: I0202 13:24:50.702361 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-75b75c495b-kpsxz"] Feb 02 13:24:50 crc kubenswrapper[4721]: I0202 13:24:50.707273 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-9b87bd57c-2glsn" podStartSLOduration=3.707255178 podStartE2EDuration="3.707255178s" podCreationTimestamp="2026-02-02 13:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:24:50.683570466 +0000 UTC m=+1430.986084865" watchObservedRunningTime="2026-02-02 13:24:50.707255178 +0000 UTC m=+1431.009769567" Feb 02 13:24:51 crc kubenswrapper[4721]: I0202 13:24:51.331967 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:24:51 crc kubenswrapper[4721]: I0202 13:24:51.332625 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="ceilometer-central-agent" containerID="cri-o://e7a1435ecc8600326126b2f72cf42852a2ff26324fb369f144ca54a8e7a02fe8" gracePeriod=30 Feb 02 13:24:51 crc kubenswrapper[4721]: I0202 13:24:51.333199 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="proxy-httpd" containerID="cri-o://f6ced5c029fbb0ced62ee14010b08872173b1b7d00f61d4cb14f85035de775b1" gracePeriod=30 Feb 02 13:24:51 crc kubenswrapper[4721]: I0202 13:24:51.333347 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="sg-core" containerID="cri-o://4d53de923a6215191fa0ab435e823d19598bea6ec82ecee59ac252608725c58c" gracePeriod=30 Feb 02 13:24:51 crc kubenswrapper[4721]: I0202 13:24:51.333420 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" 
containerName="ceilometer-notification-agent" containerID="cri-o://0b3cb07d111bcefe4cce6e8cdec0586b864539201a7ae3eef461e7791d1c123a" gracePeriod=30 Feb 02 13:24:51 crc kubenswrapper[4721]: I0202 13:24:51.348601 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.209:3000/\": EOF" Feb 02 13:24:51 crc kubenswrapper[4721]: I0202 13:24:51.510869 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-74cc678f5-fkzpw" Feb 02 13:24:51 crc kubenswrapper[4721]: I0202 13:24:51.609998 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7556fd87fb-z78lc"] Feb 02 13:24:51 crc kubenswrapper[4721]: I0202 13:24:51.610277 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7556fd87fb-z78lc" podUID="6f746721-5da3-4418-8ef6-d0b88f2121bc" containerName="neutron-api" containerID="cri-o://4badbbeb19bd1ce4dabd3dbcb1949a66c97f449e7bcb0e940791d3782f7337b1" gracePeriod=30 Feb 02 13:24:51 crc kubenswrapper[4721]: I0202 13:24:51.610674 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7556fd87fb-z78lc" podUID="6f746721-5da3-4418-8ef6-d0b88f2121bc" containerName="neutron-httpd" containerID="cri-o://8f88f40e370779b33eeb9ef5e263779063066216a67b07cb99c3e5b293638a3d" gracePeriod=30 Feb 02 13:24:51 crc kubenswrapper[4721]: I0202 13:24:51.746283 4721 generic.go:334] "Generic (PLEG): container finished" podID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerID="f6ced5c029fbb0ced62ee14010b08872173b1b7d00f61d4cb14f85035de775b1" exitCode=0 Feb 02 13:24:51 crc kubenswrapper[4721]: I0202 13:24:51.746318 4721 generic.go:334] "Generic (PLEG): container finished" podID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerID="4d53de923a6215191fa0ab435e823d19598bea6ec82ecee59ac252608725c58c" exitCode=2 Feb 02 13:24:51 crc kubenswrapper[4721]: I0202 13:24:51.746384 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1482d2e-b885-44bd-b679-109f0b9698ea","Type":"ContainerDied","Data":"f6ced5c029fbb0ced62ee14010b08872173b1b7d00f61d4cb14f85035de775b1"} Feb 02 13:24:51 crc kubenswrapper[4721]: I0202 13:24:51.746410 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1482d2e-b885-44bd-b679-109f0b9698ea","Type":"ContainerDied","Data":"4d53de923a6215191fa0ab435e823d19598bea6ec82ecee59ac252608725c58c"} Feb 02 13:24:52 crc kubenswrapper[4721]: I0202 13:24:52.425934 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="873ec78b-5777-4560-a744-c4789b43d966" path="/var/lib/kubelet/pods/873ec78b-5777-4560-a744-c4789b43d966/volumes" Feb 02 13:24:52 crc kubenswrapper[4721]: I0202 13:24:52.684483 4721 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod1dc1a7fa-c727-45f1-a53a-e9a5bc059fa5] : Timed out while waiting for systemd to remove kubepods-besteffort-pod1dc1a7fa_c727_45f1_a53a_e9a5bc059fa5.slice" Feb 02 13:24:52 crc kubenswrapper[4721]: I0202 13:24:52.790956 4721 generic.go:334] "Generic (PLEG): container finished" podID="6f746721-5da3-4418-8ef6-d0b88f2121bc" containerID="8f88f40e370779b33eeb9ef5e263779063066216a67b07cb99c3e5b293638a3d" exitCode=0 Feb 02 13:24:52 
crc kubenswrapper[4721]: I0202 13:24:52.791020 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7556fd87fb-z78lc" event={"ID":"6f746721-5da3-4418-8ef6-d0b88f2121bc","Type":"ContainerDied","Data":"8f88f40e370779b33eeb9ef5e263779063066216a67b07cb99c3e5b293638a3d"} Feb 02 13:24:52 crc kubenswrapper[4721]: I0202 13:24:52.799331 4721 generic.go:334] "Generic (PLEG): container finished" podID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerID="e7a1435ecc8600326126b2f72cf42852a2ff26324fb369f144ca54a8e7a02fe8" exitCode=0 Feb 02 13:24:52 crc kubenswrapper[4721]: I0202 13:24:52.799372 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1482d2e-b885-44bd-b679-109f0b9698ea","Type":"ContainerDied","Data":"e7a1435ecc8600326126b2f72cf42852a2ff26324fb369f144ca54a8e7a02fe8"} Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.165978 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.209:3000/\": dial tcp 10.217.0.209:3000: connect: connection refused" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.665203 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-69599c8b5f-rjs76"] Feb 02 13:24:53 crc kubenswrapper[4721]: E0202 13:24:53.666020 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873ec78b-5777-4560-a744-c4789b43d966" containerName="placement-api" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.666044 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="873ec78b-5777-4560-a744-c4789b43d966" containerName="placement-api" Feb 02 13:24:53 crc kubenswrapper[4721]: E0202 13:24:53.666080 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb085bc7-03fe-45d5-8293-754aa8a47e79" containerName="registry-server" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.666089 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb085bc7-03fe-45d5-8293-754aa8a47e79" containerName="registry-server" Feb 02 13:24:53 crc kubenswrapper[4721]: E0202 13:24:53.666113 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873ec78b-5777-4560-a744-c4789b43d966" containerName="placement-log" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.666121 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="873ec78b-5777-4560-a744-c4789b43d966" containerName="placement-log" Feb 02 13:24:53 crc kubenswrapper[4721]: E0202 13:24:53.666156 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb085bc7-03fe-45d5-8293-754aa8a47e79" containerName="extract-utilities" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.666164 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb085bc7-03fe-45d5-8293-754aa8a47e79" containerName="extract-utilities" Feb 02 13:24:53 crc kubenswrapper[4721]: E0202 13:24:53.666194 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb085bc7-03fe-45d5-8293-754aa8a47e79" containerName="extract-content" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.666202 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb085bc7-03fe-45d5-8293-754aa8a47e79" containerName="extract-content" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.666453 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb085bc7-03fe-45d5-8293-754aa8a47e79" containerName="registry-server" Feb 02 13:24:53 crc 
kubenswrapper[4721]: I0202 13:24:53.666516 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="873ec78b-5777-4560-a744-c4789b43d966" containerName="placement-log" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.666551 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="873ec78b-5777-4560-a744-c4789b43d966" containerName="placement-api" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.667575 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.674906 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.675205 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-ldgvp" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.675363 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.687104 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-69599c8b5f-rjs76"] Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.772435 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-config-data-custom\") pod \"heat-engine-69599c8b5f-rjs76\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.772634 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-combined-ca-bundle\") pod \"heat-engine-69599c8b5f-rjs76\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.772701 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxc9d\" (UniqueName: \"kubernetes.io/projected/ce072a84-75da-4060-9c4a-d029b3a14947-kube-api-access-wxc9d\") pod \"heat-engine-69599c8b5f-rjs76\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.772721 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-config-data\") pod \"heat-engine-69599c8b5f-rjs76\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.792715 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-m87tn"] Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.794624 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.814833 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-m87tn"] Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.868085 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-59bff7fb85-wq6q5"] Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.870042 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.874451 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxc9d\" (UniqueName: \"kubernetes.io/projected/ce072a84-75da-4060-9c4a-d029b3a14947-kube-api-access-wxc9d\") pod \"heat-engine-69599c8b5f-rjs76\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.874492 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-config-data\") pod \"heat-engine-69599c8b5f-rjs76\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.874542 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-config-data-custom\") pod \"heat-engine-69599c8b5f-rjs76\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.874685 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-combined-ca-bundle\") pod \"heat-engine-69599c8b5f-rjs76\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.875146 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.882022 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-config-data-custom\") pod \"heat-engine-69599c8b5f-rjs76\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.895968 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-config-data\") pod \"heat-engine-69599c8b5f-rjs76\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.905814 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-combined-ca-bundle\") pod \"heat-engine-69599c8b5f-rjs76\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.914199 4721 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-59bff7fb85-wq6q5"] Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.917083 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxc9d\" (UniqueName: \"kubernetes.io/projected/ce072a84-75da-4060-9c4a-d029b3a14947-kube-api-access-wxc9d\") pod \"heat-engine-69599c8b5f-rjs76\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.963653 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-8588ddc4dc-rq722"] Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.966025 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.973640 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.976791 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.976890 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crj69\" (UniqueName: \"kubernetes.io/projected/a217ca40-3638-474b-b739-cb8784823fa6-kube-api-access-crj69\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.976918 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-config-data\") pod \"heat-cfnapi-59bff7fb85-wq6q5\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.976987 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-combined-ca-bundle\") pod \"heat-cfnapi-59bff7fb85-wq6q5\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.977008 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-config-data-custom\") pod \"heat-cfnapi-59bff7fb85-wq6q5\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.977034 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.977062 4721 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jwmm\" (UniqueName: \"kubernetes.io/projected/bddd12fa-0653-4199-867f-bfdf51350b39-kube-api-access-6jwmm\") pod \"heat-cfnapi-59bff7fb85-wq6q5\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.977116 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.977153 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.977176 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-config\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:53 crc kubenswrapper[4721]: I0202 13:24:53.990249 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-8588ddc4dc-rq722"] Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.005667 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.079219 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-combined-ca-bundle\") pod \"heat-cfnapi-59bff7fb85-wq6q5\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.080055 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-config-data-custom\") pod \"heat-cfnapi-59bff7fb85-wq6q5\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.080351 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-config-data-custom\") pod \"heat-api-8588ddc4dc-rq722\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.080399 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.080425 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cwgk\" (UniqueName: \"kubernetes.io/projected/ee4f36c4-39c4-4cb4-b24c-676b76966752-kube-api-access-9cwgk\") pod \"heat-api-8588ddc4dc-rq722\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.080463 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-combined-ca-bundle\") pod \"heat-api-8588ddc4dc-rq722\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.080507 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jwmm\" (UniqueName: \"kubernetes.io/projected/bddd12fa-0653-4199-867f-bfdf51350b39-kube-api-access-6jwmm\") pod \"heat-cfnapi-59bff7fb85-wq6q5\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.080587 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.080627 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-config-data\") pod 
\"heat-api-8588ddc4dc-rq722\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.080677 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.080726 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-config\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.080852 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.081011 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crj69\" (UniqueName: \"kubernetes.io/projected/a217ca40-3638-474b-b739-cb8784823fa6-kube-api-access-crj69\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.081048 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-config-data\") pod \"heat-cfnapi-59bff7fb85-wq6q5\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.082043 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.082088 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-config\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.082495 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.083185 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " 
pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.084893 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-combined-ca-bundle\") pod \"heat-cfnapi-59bff7fb85-wq6q5\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.088553 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.090823 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-config-data-custom\") pod \"heat-cfnapi-59bff7fb85-wq6q5\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.112534 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-config-data\") pod \"heat-cfnapi-59bff7fb85-wq6q5\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.116876 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jwmm\" (UniqueName: \"kubernetes.io/projected/bddd12fa-0653-4199-867f-bfdf51350b39-kube-api-access-6jwmm\") pod \"heat-cfnapi-59bff7fb85-wq6q5\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.119763 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crj69\" (UniqueName: \"kubernetes.io/projected/a217ca40-3638-474b-b739-cb8784823fa6-kube-api-access-crj69\") pod \"dnsmasq-dns-7756b9d78c-m87tn\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.134661 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.183395 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-config-data-custom\") pod \"heat-api-8588ddc4dc-rq722\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.183442 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cwgk\" (UniqueName: \"kubernetes.io/projected/ee4f36c4-39c4-4cb4-b24c-676b76966752-kube-api-access-9cwgk\") pod \"heat-api-8588ddc4dc-rq722\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.183466 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-combined-ca-bundle\") pod \"heat-api-8588ddc4dc-rq722\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.183520 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-config-data\") pod \"heat-api-8588ddc4dc-rq722\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.190281 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-config-data\") pod \"heat-api-8588ddc4dc-rq722\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.190832 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-config-data-custom\") pod \"heat-api-8588ddc4dc-rq722\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.191875 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-combined-ca-bundle\") pod \"heat-api-8588ddc4dc-rq722\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.207819 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cwgk\" (UniqueName: \"kubernetes.io/projected/ee4f36c4-39c4-4cb4-b24c-676b76966752-kube-api-access-9cwgk\") pod \"heat-api-8588ddc4dc-rq722\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.292847 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.316176 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.335726 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.844199 4721 generic.go:334] "Generic (PLEG): container finished" podID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerID="0b3cb07d111bcefe4cce6e8cdec0586b864539201a7ae3eef461e7791d1c123a" exitCode=0 Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.844520 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1482d2e-b885-44bd-b679-109f0b9698ea","Type":"ContainerDied","Data":"0b3cb07d111bcefe4cce6e8cdec0586b864539201a7ae3eef461e7791d1c123a"} Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.846187 4721 generic.go:334] "Generic (PLEG): container finished" podID="6f746721-5da3-4418-8ef6-d0b88f2121bc" containerID="4badbbeb19bd1ce4dabd3dbcb1949a66c97f449e7bcb0e940791d3782f7337b1" exitCode=0 Feb 02 13:24:54 crc kubenswrapper[4721]: I0202 13:24:54.846208 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7556fd87fb-z78lc" event={"ID":"6f746721-5da3-4418-8ef6-d0b88f2121bc","Type":"ContainerDied","Data":"4badbbeb19bd1ce4dabd3dbcb1949a66c97f449e7bcb0e940791d3782f7337b1"} Feb 02 13:24:55 crc kubenswrapper[4721]: I0202 13:24:55.084496 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mlbxn" podUID="37372b76-ef54-4a44-9b56-dea754373219" containerName="registry-server" probeResult="failure" output=< Feb 02 13:24:55 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:24:55 crc kubenswrapper[4721]: > Feb 02 13:24:58 crc kubenswrapper[4721]: I0202 13:24:58.192799 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9b87bd57c-2glsn" Feb 02 13:24:58 crc kubenswrapper[4721]: I0202 13:24:58.216334 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9b87bd57c-2glsn" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.482128 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-596786fd64-rpzql"] Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.534612 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.540107 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-596786fd64-rpzql"] Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.569001 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-869db4994-hgxnh"] Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.580487 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.619219 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6c76b54b86-n9kln"] Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.620878 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.630286 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-869db4994-hgxnh"] Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.745414 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-combined-ca-bundle\") pod \"heat-cfnapi-6c76b54b86-n9kln\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.745512 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c-config-data\") pod \"heat-engine-596786fd64-rpzql\" (UID: \"6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c\") " pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.745584 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs6zw\" (UniqueName: \"kubernetes.io/projected/6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c-kube-api-access-xs6zw\") pod \"heat-engine-596786fd64-rpzql\" (UID: \"6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c\") " pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.745649 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c-combined-ca-bundle\") pod \"heat-engine-596786fd64-rpzql\" (UID: \"6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c\") " pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.745678 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-config-data-custom\") pod \"heat-api-869db4994-hgxnh\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.745700 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-config-data\") pod \"heat-api-869db4994-hgxnh\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.745828 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95q9j\" (UniqueName: \"kubernetes.io/projected/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-kube-api-access-95q9j\") pod \"heat-api-869db4994-hgxnh\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.745906 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-config-data-custom\") pod \"heat-cfnapi-6c76b54b86-n9kln\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 
13:25:00.745930 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-config-data\") pod \"heat-cfnapi-6c76b54b86-n9kln\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.745953 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c-config-data-custom\") pod \"heat-engine-596786fd64-rpzql\" (UID: \"6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c\") " pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.745993 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-combined-ca-bundle\") pod \"heat-api-869db4994-hgxnh\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.746057 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vs22\" (UniqueName: \"kubernetes.io/projected/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-kube-api-access-7vs22\") pod \"heat-cfnapi-6c76b54b86-n9kln\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.746239 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6c76b54b86-n9kln"] Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.849555 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-config-data\") pod \"heat-cfnapi-6c76b54b86-n9kln\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.850170 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c-config-data-custom\") pod \"heat-engine-596786fd64-rpzql\" (UID: \"6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c\") " pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.850237 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-combined-ca-bundle\") pod \"heat-api-869db4994-hgxnh\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.850322 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vs22\" (UniqueName: \"kubernetes.io/projected/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-kube-api-access-7vs22\") pod \"heat-cfnapi-6c76b54b86-n9kln\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.851328 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-combined-ca-bundle\") pod \"heat-cfnapi-6c76b54b86-n9kln\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.851536 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c-config-data\") pod \"heat-engine-596786fd64-rpzql\" (UID: \"6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c\") " pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.851700 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs6zw\" (UniqueName: \"kubernetes.io/projected/6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c-kube-api-access-xs6zw\") pod \"heat-engine-596786fd64-rpzql\" (UID: \"6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c\") " pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.851885 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c-combined-ca-bundle\") pod \"heat-engine-596786fd64-rpzql\" (UID: \"6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c\") " pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.851941 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-config-data-custom\") pod \"heat-api-869db4994-hgxnh\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.851976 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-config-data\") pod \"heat-api-869db4994-hgxnh\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.852395 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95q9j\" (UniqueName: \"kubernetes.io/projected/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-kube-api-access-95q9j\") pod \"heat-api-869db4994-hgxnh\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.852645 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-config-data-custom\") pod \"heat-cfnapi-6c76b54b86-n9kln\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.859372 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c-config-data-custom\") pod \"heat-engine-596786fd64-rpzql\" (UID: \"6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c\") " pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.862484 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-combined-ca-bundle\") pod \"heat-cfnapi-6c76b54b86-n9kln\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.863335 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-config-data\") pod \"heat-api-869db4994-hgxnh\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.863935 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-combined-ca-bundle\") pod \"heat-api-869db4994-hgxnh\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.873056 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-config-data-custom\") pod \"heat-cfnapi-6c76b54b86-n9kln\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.874178 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c-config-data\") pod \"heat-engine-596786fd64-rpzql\" (UID: \"6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c\") " pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.875288 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-config-data\") pod \"heat-cfnapi-6c76b54b86-n9kln\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.884832 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95q9j\" (UniqueName: \"kubernetes.io/projected/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-kube-api-access-95q9j\") pod \"heat-api-869db4994-hgxnh\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.884848 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c-combined-ca-bundle\") pod \"heat-engine-596786fd64-rpzql\" (UID: \"6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c\") " pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.885034 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-config-data-custom\") pod \"heat-api-869db4994-hgxnh\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.893304 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs6zw\" (UniqueName: \"kubernetes.io/projected/6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c-kube-api-access-xs6zw\") pod \"heat-engine-596786fd64-rpzql\" (UID: 
\"6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c\") " pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.901877 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vs22\" (UniqueName: \"kubernetes.io/projected/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-kube-api-access-7vs22\") pod \"heat-cfnapi-6c76b54b86-n9kln\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:00 crc kubenswrapper[4721]: I0202 13:25:00.926668 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:01 crc kubenswrapper[4721]: I0202 13:25:01.005038 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:01 crc kubenswrapper[4721]: I0202 13:25:01.169143 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.573375 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-8588ddc4dc-rq722"] Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.621149 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-59bff7fb85-wq6q5"] Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.646153 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-86974d69bd-t6gcz"] Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.651656 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.655797 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.663366 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.669949 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-64c55c4cc7-4htzp"] Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.671832 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.679310 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.680458 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.692014 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-64c55c4cc7-4htzp"] Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.721151 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-86974d69bd-t6gcz"] Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.810698 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c23b064-e24b-4ab3-886d-d731004b7479-config-data-custom\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.810739 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c23b064-e24b-4ab3-886d-d731004b7479-internal-tls-certs\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.810770 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c23b064-e24b-4ab3-886d-d731004b7479-public-tls-certs\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.810803 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c23b064-e24b-4ab3-886d-d731004b7479-config-data\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.810822 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d1412d5-76f7-4132-889d-f706432b3ecc-internal-tls-certs\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.810853 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjzhs\" (UniqueName: \"kubernetes.io/projected/5d1412d5-76f7-4132-889d-f706432b3ecc-kube-api-access-qjzhs\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.811013 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d1412d5-76f7-4132-889d-f706432b3ecc-public-tls-certs\") pod \"heat-api-86974d69bd-t6gcz\" (UID: 
\"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.811167 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wfhs\" (UniqueName: \"kubernetes.io/projected/5c23b064-e24b-4ab3-886d-d731004b7479-kube-api-access-4wfhs\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.811957 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c23b064-e24b-4ab3-886d-d731004b7479-combined-ca-bundle\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.812211 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d1412d5-76f7-4132-889d-f706432b3ecc-config-data\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.812366 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1412d5-76f7-4132-889d-f706432b3ecc-combined-ca-bundle\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.812599 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d1412d5-76f7-4132-889d-f706432b3ecc-config-data-custom\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.919129 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c23b064-e24b-4ab3-886d-d731004b7479-combined-ca-bundle\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.919783 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d1412d5-76f7-4132-889d-f706432b3ecc-config-data\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.919830 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1412d5-76f7-4132-889d-f706432b3ecc-combined-ca-bundle\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.919927 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d1412d5-76f7-4132-889d-f706432b3ecc-config-data-custom\") pod 
\"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.919975 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c23b064-e24b-4ab3-886d-d731004b7479-config-data-custom\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.920000 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c23b064-e24b-4ab3-886d-d731004b7479-internal-tls-certs\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.920033 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c23b064-e24b-4ab3-886d-d731004b7479-public-tls-certs\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.920146 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d1412d5-76f7-4132-889d-f706432b3ecc-internal-tls-certs\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.920217 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjzhs\" (UniqueName: \"kubernetes.io/projected/5d1412d5-76f7-4132-889d-f706432b3ecc-kube-api-access-qjzhs\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.920253 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c23b064-e24b-4ab3-886d-d731004b7479-config-data\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.920299 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d1412d5-76f7-4132-889d-f706432b3ecc-public-tls-certs\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.920351 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wfhs\" (UniqueName: \"kubernetes.io/projected/5c23b064-e24b-4ab3-886d-d731004b7479-kube-api-access-4wfhs\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.970115 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wfhs\" (UniqueName: \"kubernetes.io/projected/5c23b064-e24b-4ab3-886d-d731004b7479-kube-api-access-4wfhs\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" 
(UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.972658 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjzhs\" (UniqueName: \"kubernetes.io/projected/5d1412d5-76f7-4132-889d-f706432b3ecc-kube-api-access-qjzhs\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.973307 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d1412d5-76f7-4132-889d-f706432b3ecc-config-data-custom\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.973980 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c23b064-e24b-4ab3-886d-d731004b7479-public-tls-certs\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.974573 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c23b064-e24b-4ab3-886d-d731004b7479-combined-ca-bundle\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.975316 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d1412d5-76f7-4132-889d-f706432b3ecc-public-tls-certs\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.975517 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d1412d5-76f7-4132-889d-f706432b3ecc-internal-tls-certs\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.977003 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c23b064-e24b-4ab3-886d-d731004b7479-internal-tls-certs\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.977946 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d1412d5-76f7-4132-889d-f706432b3ecc-combined-ca-bundle\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.977953 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d1412d5-76f7-4132-889d-f706432b3ecc-config-data\") pod \"heat-api-86974d69bd-t6gcz\" (UID: \"5d1412d5-76f7-4132-889d-f706432b3ecc\") " pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:02 crc 
kubenswrapper[4721]: I0202 13:25:02.988486 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c23b064-e24b-4ab3-886d-d731004b7479-config-data\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:02 crc kubenswrapper[4721]: I0202 13:25:02.996845 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c23b064-e24b-4ab3-886d-d731004b7479-config-data-custom\") pod \"heat-cfnapi-64c55c4cc7-4htzp\" (UID: \"5c23b064-e24b-4ab3-886d-d731004b7479\") " pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:03 crc kubenswrapper[4721]: I0202 13:25:03.077044 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:03 crc kubenswrapper[4721]: I0202 13:25:03.109983 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:03 crc kubenswrapper[4721]: I0202 13:25:03.534945 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-8588ddc4dc-rq722"] Feb 02 13:25:03 crc kubenswrapper[4721]: I0202 13:25:03.939645 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:25:03 crc kubenswrapper[4721]: I0202 13:25:03.956605 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.002727 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-scripts\") pod \"a1482d2e-b885-44bd-b679-109f0b9698ea\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.003165 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-config-data\") pod \"a1482d2e-b885-44bd-b679-109f0b9698ea\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.003251 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1482d2e-b885-44bd-b679-109f0b9698ea-log-httpd\") pod \"a1482d2e-b885-44bd-b679-109f0b9698ea\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.003336 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1482d2e-b885-44bd-b679-109f0b9698ea-run-httpd\") pod \"a1482d2e-b885-44bd-b679-109f0b9698ea\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.003351 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-combined-ca-bundle\") pod \"a1482d2e-b885-44bd-b679-109f0b9698ea\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.003372 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-sg-core-conf-yaml\") pod \"a1482d2e-b885-44bd-b679-109f0b9698ea\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.003443 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4mxr\" (UniqueName: \"kubernetes.io/projected/a1482d2e-b885-44bd-b679-109f0b9698ea-kube-api-access-c4mxr\") pod \"a1482d2e-b885-44bd-b679-109f0b9698ea\" (UID: \"a1482d2e-b885-44bd-b679-109f0b9698ea\") " Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.005920 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1482d2e-b885-44bd-b679-109f0b9698ea-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a1482d2e-b885-44bd-b679-109f0b9698ea" (UID: "a1482d2e-b885-44bd-b679-109f0b9698ea"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.006679 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1482d2e-b885-44bd-b679-109f0b9698ea-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a1482d2e-b885-44bd-b679-109f0b9698ea" (UID: "a1482d2e-b885-44bd-b679-109f0b9698ea"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.074482 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-scripts" (OuterVolumeSpecName: "scripts") pod "a1482d2e-b885-44bd-b679-109f0b9698ea" (UID: "a1482d2e-b885-44bd-b679-109f0b9698ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.092106 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1482d2e-b885-44bd-b679-109f0b9698ea-kube-api-access-c4mxr" (OuterVolumeSpecName: "kube-api-access-c4mxr") pod "a1482d2e-b885-44bd-b679-109f0b9698ea" (UID: "a1482d2e-b885-44bd-b679-109f0b9698ea"). InnerVolumeSpecName "kube-api-access-c4mxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.112345 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw65p\" (UniqueName: \"kubernetes.io/projected/6f746721-5da3-4418-8ef6-d0b88f2121bc-kube-api-access-jw65p\") pod \"6f746721-5da3-4418-8ef6-d0b88f2121bc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.112450 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-combined-ca-bundle\") pod \"6f746721-5da3-4418-8ef6-d0b88f2121bc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.112671 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-ovndb-tls-certs\") pod \"6f746721-5da3-4418-8ef6-d0b88f2121bc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.112735 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-httpd-config\") pod \"6f746721-5da3-4418-8ef6-d0b88f2121bc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.112776 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-config\") pod \"6f746721-5da3-4418-8ef6-d0b88f2121bc\" (UID: \"6f746721-5da3-4418-8ef6-d0b88f2121bc\") " Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.113359 4721 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1482d2e-b885-44bd-b679-109f0b9698ea-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.113371 4721 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1482d2e-b885-44bd-b679-109f0b9698ea-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.113381 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4mxr\" (UniqueName: \"kubernetes.io/projected/a1482d2e-b885-44bd-b679-109f0b9698ea-kube-api-access-c4mxr\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.113391 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.128373 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f746721-5da3-4418-8ef6-d0b88f2121bc-kube-api-access-jw65p" (OuterVolumeSpecName: "kube-api-access-jw65p") pod "6f746721-5da3-4418-8ef6-d0b88f2121bc" (UID: "6f746721-5da3-4418-8ef6-d0b88f2121bc"). InnerVolumeSpecName "kube-api-access-jw65p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.129286 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6f746721-5da3-4418-8ef6-d0b88f2121bc" (UID: "6f746721-5da3-4418-8ef6-d0b88f2121bc"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.206618 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-8588ddc4dc-rq722" event={"ID":"ee4f36c4-39c4-4cb4-b24c-676b76966752","Type":"ContainerStarted","Data":"22a62ef40c5e00afd66630e4761effc79fe1d27fa65d0590da25c3c1d43ad9bf"} Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.220883 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw65p\" (UniqueName: \"kubernetes.io/projected/6f746721-5da3-4418-8ef6-d0b88f2121bc-kube-api-access-jw65p\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.220924 4721 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.269035 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f746721-5da3-4418-8ef6-d0b88f2121bc" (UID: "6f746721-5da3-4418-8ef6-d0b88f2121bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.270965 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7556fd87fb-z78lc" event={"ID":"6f746721-5da3-4418-8ef6-d0b88f2121bc","Type":"ContainerDied","Data":"8cc4a5e49bcdc1259392f527ba7a63bedab94aac24105814e1cdaa17c7280e6e"} Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.271296 4721 scope.go:117] "RemoveContainer" containerID="8f88f40e370779b33eeb9ef5e263779063066216a67b07cb99c3e5b293638a3d" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.271495 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7556fd87fb-z78lc" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.284361 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-config" (OuterVolumeSpecName: "config") pod "6f746721-5da3-4418-8ef6-d0b88f2121bc" (UID: "6f746721-5da3-4418-8ef6-d0b88f2121bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.303394 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1482d2e-b885-44bd-b679-109f0b9698ea","Type":"ContainerDied","Data":"c455b716179204c8c0342b961a933847d0497bb7d80b63f7c4dd9a07665bceb2"} Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.303537 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.314520 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a1482d2e-b885-44bd-b679-109f0b9698ea" (UID: "a1482d2e-b885-44bd-b679-109f0b9698ea"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.331298 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.331337 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.331355 4721 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.355294 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6f746721-5da3-4418-8ef6-d0b88f2121bc" (UID: "6f746721-5da3-4418-8ef6-d0b88f2121bc"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.409464 4721 scope.go:117] "RemoveContainer" containerID="4badbbeb19bd1ce4dabd3dbcb1949a66c97f449e7bcb0e940791d3782f7337b1" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.437039 4721 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f746721-5da3-4418-8ef6-d0b88f2121bc-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.519647 4721 scope.go:117] "RemoveContainer" containerID="f6ced5c029fbb0ced62ee14010b08872173b1b7d00f61d4cb14f85035de775b1" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.636753 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1482d2e-b885-44bd-b679-109f0b9698ea" (UID: "a1482d2e-b885-44bd-b679-109f0b9698ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.649784 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-config-data" (OuterVolumeSpecName: "config-data") pod "a1482d2e-b885-44bd-b679-109f0b9698ea" (UID: "a1482d2e-b885-44bd-b679-109f0b9698ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.679582 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.679831 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1482d2e-b885-44bd-b679-109f0b9698ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.724847 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6c76b54b86-n9kln"] Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.790283 4721 scope.go:117] "RemoveContainer" containerID="4d53de923a6215191fa0ab435e823d19598bea6ec82ecee59ac252608725c58c" Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.798839 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7556fd87fb-z78lc"] Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.816205 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7556fd87fb-z78lc"] Feb 02 13:25:04 crc kubenswrapper[4721]: I0202 13:25:04.837190 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-69599c8b5f-rjs76"] Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:04.999049 4721 scope.go:117] "RemoveContainer" containerID="0b3cb07d111bcefe4cce6e8cdec0586b864539201a7ae3eef461e7791d1c123a" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.008380 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.031229 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.059436 4721 scope.go:117] "RemoveContainer" containerID="e7a1435ecc8600326126b2f72cf42852a2ff26324fb369f144ca54a8e7a02fe8" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.063104 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:25:05 crc kubenswrapper[4721]: E0202 13:25:05.064016 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="ceilometer-notification-agent" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.064035 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="ceilometer-notification-agent" Feb 02 13:25:05 crc kubenswrapper[4721]: E0202 13:25:05.064050 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="sg-core" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.064055 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="sg-core" Feb 02 13:25:05 crc kubenswrapper[4721]: E0202 13:25:05.064081 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="ceilometer-central-agent" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.064087 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="ceilometer-central-agent" Feb 02 13:25:05 crc kubenswrapper[4721]: E0202 13:25:05.064116 4721 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6f746721-5da3-4418-8ef6-d0b88f2121bc" containerName="neutron-api" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.064122 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f746721-5da3-4418-8ef6-d0b88f2121bc" containerName="neutron-api" Feb 02 13:25:05 crc kubenswrapper[4721]: E0202 13:25:05.064141 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="proxy-httpd" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.064148 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="proxy-httpd" Feb 02 13:25:05 crc kubenswrapper[4721]: E0202 13:25:05.064161 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f746721-5da3-4418-8ef6-d0b88f2121bc" containerName="neutron-httpd" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.064167 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f746721-5da3-4418-8ef6-d0b88f2121bc" containerName="neutron-httpd" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.064383 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="ceilometer-central-agent" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.064401 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="proxy-httpd" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.064411 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f746721-5da3-4418-8ef6-d0b88f2121bc" containerName="neutron-httpd" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.064421 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="ceilometer-notification-agent" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.064439 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f746721-5da3-4418-8ef6-d0b88f2121bc" containerName="neutron-api" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.064450 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" containerName="sg-core" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.067022 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.070844 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.071116 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.078253 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.206962 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-scripts\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.207133 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/679713b8-7e9b-4ccc-87f3-85afd17dc008-run-httpd\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.207322 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.207357 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7n46\" (UniqueName: \"kubernetes.io/projected/679713b8-7e9b-4ccc-87f3-85afd17dc008-kube-api-access-r7n46\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.207484 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-config-data\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.207807 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/679713b8-7e9b-4ccc-87f3-85afd17dc008-log-httpd\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.207890 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.266584 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mlbxn" podUID="37372b76-ef54-4a44-9b56-dea754373219" containerName="registry-server" probeResult="failure" output=< Feb 02 13:25:05 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:25:05 crc kubenswrapper[4721]: 
> Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.310244 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/679713b8-7e9b-4ccc-87f3-85afd17dc008-run-httpd\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.310362 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.310519 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7n46\" (UniqueName: \"kubernetes.io/projected/679713b8-7e9b-4ccc-87f3-85afd17dc008-kube-api-access-r7n46\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.310627 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-config-data\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.310776 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/679713b8-7e9b-4ccc-87f3-85afd17dc008-log-httpd\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.310869 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.311013 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-scripts\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.311785 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/679713b8-7e9b-4ccc-87f3-85afd17dc008-run-httpd\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.312491 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/679713b8-7e9b-4ccc-87f3-85afd17dc008-log-httpd\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.315504 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 
13:25:05.322706 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-config-data\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.327280 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-scripts\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.344658 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.357589 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7n46\" (UniqueName: \"kubernetes.io/projected/679713b8-7e9b-4ccc-87f3-85afd17dc008-kube-api-access-r7n46\") pod \"ceilometer-0\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") " pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.372234 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" event={"ID":"c5f7cb67-4d7c-4bc8-bf45-c949450206f0","Type":"ContainerStarted","Data":"622fe9b6176f587f10b3c6f779b6fa54763b7ccedb8408baf73c74843450fabf"} Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.395668 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-69599c8b5f-rjs76" event={"ID":"ce072a84-75da-4060-9c4a-d029b3a14947","Type":"ContainerStarted","Data":"824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062"} Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.395711 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-69599c8b5f-rjs76" event={"ID":"ce072a84-75da-4060-9c4a-d029b3a14947","Type":"ContainerStarted","Data":"c5aaadf4a39fbac2caf4ce2bae03ec472daa951b771c303d721f863e7147d5f2"} Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.396319 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.417662 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.429776 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"32729b18-a175-4abd-a8cf-392d318b64d8","Type":"ContainerStarted","Data":"3f17b6addea34db921e5e823921c2614b964aafd11d79b52c8d69de9bcab3c0d"} Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.489301 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-m87tn"] Feb 02 13:25:05 crc kubenswrapper[4721]: W0202 13:25:05.541679 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbddd12fa_0653_4199_867f_bfdf51350b39.slice/crio-0f9d2ecaa1c8c841d9801b80ea33e35c8c2a2c815bca71770000a6385ed7be15 WatchSource:0}: Error finding container 0f9d2ecaa1c8c841d9801b80ea33e35c8c2a2c815bca71770000a6385ed7be15: Status 404 returned error can't find the container with id 0f9d2ecaa1c8c841d9801b80ea33e35c8c2a2c815bca71770000a6385ed7be15 Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.588395 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-869db4994-hgxnh"] Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.606439 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-86974d69bd-t6gcz"] Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.612384 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-69599c8b5f-rjs76" podStartSLOduration=12.612366269 podStartE2EDuration="12.612366269s" podCreationTimestamp="2026-02-02 13:24:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:25:05.442586765 +0000 UTC m=+1445.745101154" watchObservedRunningTime="2026-02-02 13:25:05.612366269 +0000 UTC m=+1445.914880658" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.639972 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-59bff7fb85-wq6q5"] Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.650272 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.191018503 podStartE2EDuration="24.650246847s" podCreationTimestamp="2026-02-02 13:24:41 +0000 UTC" firstStartedPulling="2026-02-02 13:24:42.355710807 +0000 UTC m=+1422.658225186" lastFinishedPulling="2026-02-02 13:25:03.814939141 +0000 UTC m=+1444.117453530" observedRunningTime="2026-02-02 13:25:05.483979217 +0000 UTC m=+1445.786493626" watchObservedRunningTime="2026-02-02 13:25:05.650246847 +0000 UTC m=+1445.952761246" Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.665697 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-596786fd64-rpzql"] Feb 02 13:25:05 crc kubenswrapper[4721]: I0202 13:25:05.709770 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-64c55c4cc7-4htzp"] Feb 02 13:25:05 crc kubenswrapper[4721]: W0202 13:25:05.792749 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c23b064_e24b_4ab3_886d_d731004b7479.slice/crio-9ad18306781c46f28b10283fd0aff6494f3502186f08023b078948b6cce763c5 WatchSource:0}: Error finding container 9ad18306781c46f28b10283fd0aff6494f3502186f08023b078948b6cce763c5: Status 404 returned error can't find the container with id 
9ad18306781c46f28b10283fd0aff6494f3502186f08023b078948b6cce763c5 Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.035212 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.142512 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-scripts\") pod \"a6048763-9be8-4530-b02a-78022c20d668\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.142544 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-combined-ca-bundle\") pod \"a6048763-9be8-4530-b02a-78022c20d668\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.142576 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6048763-9be8-4530-b02a-78022c20d668-etc-machine-id\") pod \"a6048763-9be8-4530-b02a-78022c20d668\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.142616 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-config-data\") pod \"a6048763-9be8-4530-b02a-78022c20d668\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.142730 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-config-data-custom\") pod \"a6048763-9be8-4530-b02a-78022c20d668\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.142841 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8k67\" (UniqueName: \"kubernetes.io/projected/a6048763-9be8-4530-b02a-78022c20d668-kube-api-access-m8k67\") pod \"a6048763-9be8-4530-b02a-78022c20d668\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.142986 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6048763-9be8-4530-b02a-78022c20d668-logs\") pod \"a6048763-9be8-4530-b02a-78022c20d668\" (UID: \"a6048763-9be8-4530-b02a-78022c20d668\") " Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.145093 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6048763-9be8-4530-b02a-78022c20d668-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a6048763-9be8-4530-b02a-78022c20d668" (UID: "a6048763-9be8-4530-b02a-78022c20d668"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.146058 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6048763-9be8-4530-b02a-78022c20d668-logs" (OuterVolumeSpecName: "logs") pod "a6048763-9be8-4530-b02a-78022c20d668" (UID: "a6048763-9be8-4530-b02a-78022c20d668"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.188141 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6048763-9be8-4530-b02a-78022c20d668-kube-api-access-m8k67" (OuterVolumeSpecName: "kube-api-access-m8k67") pod "a6048763-9be8-4530-b02a-78022c20d668" (UID: "a6048763-9be8-4530-b02a-78022c20d668"). InnerVolumeSpecName "kube-api-access-m8k67". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.197206 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-scripts" (OuterVolumeSpecName: "scripts") pod "a6048763-9be8-4530-b02a-78022c20d668" (UID: "a6048763-9be8-4530-b02a-78022c20d668"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.234394 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a6048763-9be8-4530-b02a-78022c20d668" (UID: "a6048763-9be8-4530-b02a-78022c20d668"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.259248 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8k67\" (UniqueName: \"kubernetes.io/projected/a6048763-9be8-4530-b02a-78022c20d668-kube-api-access-m8k67\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.259310 4721 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6048763-9be8-4530-b02a-78022c20d668-logs\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.259325 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.259350 4721 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a6048763-9be8-4530-b02a-78022c20d668-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.259375 4721 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.332940 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.445686 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6048763-9be8-4530-b02a-78022c20d668" (UID: "a6048763-9be8-4530-b02a-78022c20d668"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.490523 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f746721-5da3-4418-8ef6-d0b88f2121bc" path="/var/lib/kubelet/pods/6f746721-5da3-4418-8ef6-d0b88f2121bc/volumes" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.492194 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1482d2e-b885-44bd-b679-109f0b9698ea" path="/var/lib/kubelet/pods/a1482d2e-b885-44bd-b679-109f0b9698ea/volumes" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.493791 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.511506 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-config-data" (OuterVolumeSpecName: "config-data") pod "a6048763-9be8-4530-b02a-78022c20d668" (UID: "a6048763-9be8-4530-b02a-78022c20d668"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.521359 4721 generic.go:334] "Generic (PLEG): container finished" podID="a6048763-9be8-4530-b02a-78022c20d668" containerID="014d914a5ce67334b42e317ebe6baa31bca4aa273d11d84a4123f675577064a0" exitCode=137 Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.521755 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.596830 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6048763-9be8-4530-b02a-78022c20d668-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.611179 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-596786fd64-rpzql" podStartSLOduration=6.611109816 podStartE2EDuration="6.611109816s" podCreationTimestamp="2026-02-02 13:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:25:06.549428243 +0000 UTC m=+1446.851942632" watchObservedRunningTime="2026-02-02 13:25:06.611109816 +0000 UTC m=+1446.913624235" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.687319 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-869db4994-hgxnh" event={"ID":"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8","Type":"ContainerStarted","Data":"b49739d0b82310eb7a88b3289808bff2fac1bf21a0b025f6531793460d8748fd"} Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.687380 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.687393 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" event={"ID":"5c23b064-e24b-4ab3-886d-d731004b7479","Type":"ContainerStarted","Data":"9ad18306781c46f28b10283fd0aff6494f3502186f08023b078948b6cce763c5"} Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.687413 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" 
event={"ID":"a217ca40-3638-474b-b739-cb8784823fa6","Type":"ContainerStarted","Data":"855aab218dced6a9a20cd36dee1f3e920c647b6da54cc409503c35b4f9458f8e"} Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.687426 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" event={"ID":"bddd12fa-0653-4199-867f-bfdf51350b39","Type":"ContainerStarted","Data":"0f9d2ecaa1c8c841d9801b80ea33e35c8c2a2c815bca71770000a6385ed7be15"} Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.687438 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a6048763-9be8-4530-b02a-78022c20d668","Type":"ContainerDied","Data":"014d914a5ce67334b42e317ebe6baa31bca4aa273d11d84a4123f675577064a0"} Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.687454 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a6048763-9be8-4530-b02a-78022c20d668","Type":"ContainerDied","Data":"722b50e74b8986a87887949a03c96fc3de9ed0d41b61df9c48e69c902703c27b"} Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.687463 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-596786fd64-rpzql" event={"ID":"6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c","Type":"ContainerStarted","Data":"aa5741d6d05ef4756b3d79fd25470a761119d83f3c7124c8d4e040ceaf9fb0b0"} Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.687471 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-596786fd64-rpzql" event={"ID":"6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c","Type":"ContainerStarted","Data":"c28cbd79ea67fd089d525cb64965c78c717693cbb211141dd558ccdd5b91d1f8"} Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.687479 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-86974d69bd-t6gcz" event={"ID":"5d1412d5-76f7-4132-889d-f706432b3ecc","Type":"ContainerStarted","Data":"7896797b149f41786af0097adf6f640b7ee27ccb8c123187623a348336549aa9"} Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.687496 4721 scope.go:117] "RemoveContainer" containerID="014d914a5ce67334b42e317ebe6baa31bca4aa273d11d84a4123f675577064a0" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.740919 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.754056 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.772279 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 02 13:25:06 crc kubenswrapper[4721]: E0202 13:25:06.772857 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6048763-9be8-4530-b02a-78022c20d668" containerName="cinder-api" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.772882 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6048763-9be8-4530-b02a-78022c20d668" containerName="cinder-api" Feb 02 13:25:06 crc kubenswrapper[4721]: E0202 13:25:06.772913 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6048763-9be8-4530-b02a-78022c20d668" containerName="cinder-api-log" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.772924 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6048763-9be8-4530-b02a-78022c20d668" containerName="cinder-api-log" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.789917 4721 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a6048763-9be8-4530-b02a-78022c20d668" containerName="cinder-api-log" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.789980 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6048763-9be8-4530-b02a-78022c20d668" containerName="cinder-api" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.800639 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.800777 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.804337 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.804577 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.807501 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.909570 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djkzn\" (UniqueName: \"kubernetes.io/projected/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-kube-api-access-djkzn\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.909643 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-config-data\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.909700 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.909726 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-config-data-custom\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.909844 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.909892 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.909966 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-logs\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.910272 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-scripts\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:06 crc kubenswrapper[4721]: I0202 13:25:06.910379 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.012848 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-config-data\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.012913 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.012949 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-config-data-custom\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.012991 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.013019 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.013079 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-logs\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.013150 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.014055 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-scripts\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.014166 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.014377 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djkzn\" (UniqueName: \"kubernetes.io/projected/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-kube-api-access-djkzn\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.014575 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-logs\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.020898 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-scripts\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.021608 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.025549 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.035853 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-config-data-custom\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.036954 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.059833 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djkzn\" (UniqueName: \"kubernetes.io/projected/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-kube-api-access-djkzn\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.063057 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0eabfa0b-0304-4eda-8f8a-dc9160569e4b-config-data\") pod \"cinder-api-0\" (UID: \"0eabfa0b-0304-4eda-8f8a-dc9160569e4b\") " pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.135945 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.552866 4721 generic.go:334] "Generic (PLEG): container finished" podID="a217ca40-3638-474b-b739-cb8784823fa6" containerID="bfd05ac10c25cef470159320ac73442924479047b666e7ec3809106d9b26e0ed" exitCode=0 Feb 02 13:25:07 crc kubenswrapper[4721]: I0202 13:25:07.554709 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" event={"ID":"a217ca40-3638-474b-b739-cb8784823fa6","Type":"ContainerDied","Data":"bfd05ac10c25cef470159320ac73442924479047b666e7ec3809106d9b26e0ed"} Feb 02 13:25:07 crc kubenswrapper[4721]: W0202 13:25:07.815957 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod679713b8_7e9b_4ccc_87f3_85afd17dc008.slice/crio-6f08f3fdbee7bda0664d6d60e3f3f3c62d78c95caee038e2d39526bac0095990 WatchSource:0}: Error finding container 6f08f3fdbee7bda0664d6d60e3f3f3c62d78c95caee038e2d39526bac0095990: Status 404 returned error can't find the container with id 6f08f3fdbee7bda0664d6d60e3f3f3c62d78c95caee038e2d39526bac0095990 Feb 02 13:25:08 crc kubenswrapper[4721]: I0202 13:25:08.427310 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6048763-9be8-4530-b02a-78022c20d668" path="/var/lib/kubelet/pods/a6048763-9be8-4530-b02a-78022c20d668/volumes" Feb 02 13:25:08 crc kubenswrapper[4721]: I0202 13:25:08.497958 4721 scope.go:117] "RemoveContainer" containerID="e2b0754f177aa2fb457d07e7ccb63cf3d2db89f9bc5c8b27f274b7d5e64b1681" Feb 02 13:25:08 crc kubenswrapper[4721]: I0202 13:25:08.592288 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"679713b8-7e9b-4ccc-87f3-85afd17dc008","Type":"ContainerStarted","Data":"6f08f3fdbee7bda0664d6d60e3f3f3c62d78c95caee038e2d39526bac0095990"} Feb 02 13:25:08 crc kubenswrapper[4721]: I0202 13:25:08.615890 4721 scope.go:117] "RemoveContainer" containerID="014d914a5ce67334b42e317ebe6baa31bca4aa273d11d84a4123f675577064a0" Feb 02 13:25:08 crc kubenswrapper[4721]: E0202 13:25:08.617229 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"014d914a5ce67334b42e317ebe6baa31bca4aa273d11d84a4123f675577064a0\": container with ID starting with 014d914a5ce67334b42e317ebe6baa31bca4aa273d11d84a4123f675577064a0 not found: ID does not exist" containerID="014d914a5ce67334b42e317ebe6baa31bca4aa273d11d84a4123f675577064a0" Feb 02 13:25:08 crc kubenswrapper[4721]: I0202 13:25:08.617275 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"014d914a5ce67334b42e317ebe6baa31bca4aa273d11d84a4123f675577064a0"} err="failed to get container status \"014d914a5ce67334b42e317ebe6baa31bca4aa273d11d84a4123f675577064a0\": rpc error: code = NotFound desc = could not find container \"014d914a5ce67334b42e317ebe6baa31bca4aa273d11d84a4123f675577064a0\": container with ID starting with 014d914a5ce67334b42e317ebe6baa31bca4aa273d11d84a4123f675577064a0 not found: ID does not exist" Feb 02 13:25:08 crc kubenswrapper[4721]: I0202 13:25:08.617303 4721 scope.go:117] "RemoveContainer" 
containerID="e2b0754f177aa2fb457d07e7ccb63cf3d2db89f9bc5c8b27f274b7d5e64b1681" Feb 02 13:25:08 crc kubenswrapper[4721]: E0202 13:25:08.617631 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2b0754f177aa2fb457d07e7ccb63cf3d2db89f9bc5c8b27f274b7d5e64b1681\": container with ID starting with e2b0754f177aa2fb457d07e7ccb63cf3d2db89f9bc5c8b27f274b7d5e64b1681 not found: ID does not exist" containerID="e2b0754f177aa2fb457d07e7ccb63cf3d2db89f9bc5c8b27f274b7d5e64b1681" Feb 02 13:25:08 crc kubenswrapper[4721]: I0202 13:25:08.617655 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2b0754f177aa2fb457d07e7ccb63cf3d2db89f9bc5c8b27f274b7d5e64b1681"} err="failed to get container status \"e2b0754f177aa2fb457d07e7ccb63cf3d2db89f9bc5c8b27f274b7d5e64b1681\": rpc error: code = NotFound desc = could not find container \"e2b0754f177aa2fb457d07e7ccb63cf3d2db89f9bc5c8b27f274b7d5e64b1681\": container with ID starting with e2b0754f177aa2fb457d07e7ccb63cf3d2db89f9bc5c8b27f274b7d5e64b1681 not found: ID does not exist" Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.133406 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.616505 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" event={"ID":"5c23b064-e24b-4ab3-886d-d731004b7479","Type":"ContainerStarted","Data":"53fda4e0de936c98aab52cbdc7d3f337c88ae4716d61fd71055f35f57f0a8272"} Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.616874 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.621868 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" event={"ID":"a217ca40-3638-474b-b739-cb8784823fa6","Type":"ContainerStarted","Data":"02eb0b572a9d7db26d91ed79b8f1f7e0e781ae5e0ac90b394b51265b1c190088"} Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.622779 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.626404 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" event={"ID":"bddd12fa-0653-4199-867f-bfdf51350b39","Type":"ContainerStarted","Data":"c89c6bf878c74f41052b7958471a080d9fee687b5e9464975790c1897da44a75"} Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.626539 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" podUID="bddd12fa-0653-4199-867f-bfdf51350b39" containerName="heat-cfnapi" containerID="cri-o://c89c6bf878c74f41052b7958471a080d9fee687b5e9464975790c1897da44a75" gracePeriod=60 Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.626553 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.652049 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" podStartSLOduration=4.74915499 podStartE2EDuration="7.652034129s" podCreationTimestamp="2026-02-02 13:25:02 +0000 UTC" firstStartedPulling="2026-02-02 13:25:05.801026146 +0000 UTC m=+1446.103540535" lastFinishedPulling="2026-02-02 13:25:08.703905285 
+0000 UTC m=+1449.006419674" observedRunningTime="2026-02-02 13:25:09.64990136 +0000 UTC m=+1449.952415769" watchObservedRunningTime="2026-02-02 13:25:09.652034129 +0000 UTC m=+1449.954548518" Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.652162 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.690472 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" event={"ID":"c5f7cb67-4d7c-4bc8-bf45-c949450206f0","Type":"ContainerStarted","Data":"cacfe9dad3c5ac1611cb10a831ba529e6821a0590168b211ab46225955ae992f"} Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.692140 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.711154 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" podStartSLOduration=13.614440355 podStartE2EDuration="16.711131631s" podCreationTimestamp="2026-02-02 13:24:53 +0000 UTC" firstStartedPulling="2026-02-02 13:25:05.597898456 +0000 UTC m=+1445.900412845" lastFinishedPulling="2026-02-02 13:25:08.694589732 +0000 UTC m=+1448.997104121" observedRunningTime="2026-02-02 13:25:09.69118353 +0000 UTC m=+1449.993697919" watchObservedRunningTime="2026-02-02 13:25:09.711131631 +0000 UTC m=+1450.013646040" Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.723081 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0eabfa0b-0304-4eda-8f8a-dc9160569e4b","Type":"ContainerStarted","Data":"6d51b976af288be98113ba87064c5de10cd72146e96a3af7a510329a6991fa32"} Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.778604 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" podStartSLOduration=16.77858249 podStartE2EDuration="16.77858249s" podCreationTimestamp="2026-02-02 13:24:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:25:09.723388174 +0000 UTC m=+1450.025902563" watchObservedRunningTime="2026-02-02 13:25:09.77858249 +0000 UTC m=+1450.081096889" Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.842855 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" podStartSLOduration=5.986481586 podStartE2EDuration="9.842831663s" podCreationTimestamp="2026-02-02 13:25:00 +0000 UTC" firstStartedPulling="2026-02-02 13:25:04.79051428 +0000 UTC m=+1445.093028669" lastFinishedPulling="2026-02-02 13:25:08.646864357 +0000 UTC m=+1448.949378746" observedRunningTime="2026-02-02 13:25:09.766100192 +0000 UTC m=+1450.068614581" watchObservedRunningTime="2026-02-02 13:25:09.842831663 +0000 UTC m=+1450.145346052" Feb 02 13:25:09 crc kubenswrapper[4721]: I0202 13:25:09.846105 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-869db4994-hgxnh" podStartSLOduration=6.657639128 podStartE2EDuration="9.846094092s" podCreationTimestamp="2026-02-02 13:25:00 +0000 UTC" firstStartedPulling="2026-02-02 13:25:05.513522838 +0000 UTC m=+1445.816037227" lastFinishedPulling="2026-02-02 13:25:08.701977802 +0000 UTC m=+1449.004492191" observedRunningTime="2026-02-02 13:25:09.794513313 +0000 UTC m=+1450.097027692" watchObservedRunningTime="2026-02-02 
13:25:09.846094092 +0000 UTC m=+1450.148608501" Feb 02 13:25:10 crc kubenswrapper[4721]: I0202 13:25:10.739649 4721 generic.go:334] "Generic (PLEG): container finished" podID="2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" containerID="d6a7a5707b26d69fb1863e451668fcdd388419479b4005bb1d55054ada2fb366" exitCode=1 Feb 02 13:25:10 crc kubenswrapper[4721]: I0202 13:25:10.739704 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-869db4994-hgxnh" event={"ID":"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8","Type":"ContainerDied","Data":"d6a7a5707b26d69fb1863e451668fcdd388419479b4005bb1d55054ada2fb366"} Feb 02 13:25:10 crc kubenswrapper[4721]: I0202 13:25:10.740779 4721 scope.go:117] "RemoveContainer" containerID="d6a7a5707b26d69fb1863e451668fcdd388419479b4005bb1d55054ada2fb366" Feb 02 13:25:10 crc kubenswrapper[4721]: I0202 13:25:10.745704 4721 generic.go:334] "Generic (PLEG): container finished" podID="c5f7cb67-4d7c-4bc8-bf45-c949450206f0" containerID="cacfe9dad3c5ac1611cb10a831ba529e6821a0590168b211ab46225955ae992f" exitCode=1 Feb 02 13:25:10 crc kubenswrapper[4721]: I0202 13:25:10.745779 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" event={"ID":"c5f7cb67-4d7c-4bc8-bf45-c949450206f0","Type":"ContainerDied","Data":"cacfe9dad3c5ac1611cb10a831ba529e6821a0590168b211ab46225955ae992f"} Feb 02 13:25:10 crc kubenswrapper[4721]: I0202 13:25:10.746672 4721 scope.go:117] "RemoveContainer" containerID="cacfe9dad3c5ac1611cb10a831ba529e6821a0590168b211ab46225955ae992f" Feb 02 13:25:10 crc kubenswrapper[4721]: I0202 13:25:10.756623 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-8588ddc4dc-rq722" event={"ID":"ee4f36c4-39c4-4cb4-b24c-676b76966752","Type":"ContainerStarted","Data":"fa800e1436f00d168fdcf5bb1372a6b44ff3f3a66add3f54482b3057f2c14bd2"} Feb 02 13:25:10 crc kubenswrapper[4721]: I0202 13:25:10.756803 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-8588ddc4dc-rq722" podUID="ee4f36c4-39c4-4cb4-b24c-676b76966752" containerName="heat-api" containerID="cri-o://fa800e1436f00d168fdcf5bb1372a6b44ff3f3a66add3f54482b3057f2c14bd2" gracePeriod=60 Feb 02 13:25:10 crc kubenswrapper[4721]: I0202 13:25:10.757126 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:25:10 crc kubenswrapper[4721]: I0202 13:25:10.790080 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"679713b8-7e9b-4ccc-87f3-85afd17dc008","Type":"ContainerStarted","Data":"22439dbfbc2d5b9a7aaa38b029bb7c102abafe93d1e9397f189c54e9d9a93d8a"} Feb 02 13:25:10 crc kubenswrapper[4721]: I0202 13:25:10.831162 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-86974d69bd-t6gcz" event={"ID":"5d1412d5-76f7-4132-889d-f706432b3ecc","Type":"ContainerStarted","Data":"525d4f77f841cd0b1218431404c5fc5cc758ca5f6ff2e06ab4757a3b9c618d22"} Feb 02 13:25:10 crc kubenswrapper[4721]: I0202 13:25:10.832128 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:10 crc kubenswrapper[4721]: I0202 13:25:10.833381 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-8588ddc4dc-rq722" podStartSLOduration=12.869659607 podStartE2EDuration="17.833361098s" podCreationTimestamp="2026-02-02 13:24:53 +0000 UTC" firstStartedPulling="2026-02-02 13:25:03.675973972 +0000 UTC m=+1443.978488361" 
lastFinishedPulling="2026-02-02 13:25:08.639675463 +0000 UTC m=+1448.942189852" observedRunningTime="2026-02-02 13:25:10.814392213 +0000 UTC m=+1451.116906632" watchObservedRunningTime="2026-02-02 13:25:10.833361098 +0000 UTC m=+1451.135875487" Feb 02 13:25:10 crc kubenswrapper[4721]: I0202 13:25:10.869502 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-86974d69bd-t6gcz" podStartSLOduration=5.7635264809999995 podStartE2EDuration="8.869478667s" podCreationTimestamp="2026-02-02 13:25:02 +0000 UTC" firstStartedPulling="2026-02-02 13:25:05.542312589 +0000 UTC m=+1445.844826978" lastFinishedPulling="2026-02-02 13:25:08.648264775 +0000 UTC m=+1448.950779164" observedRunningTime="2026-02-02 13:25:10.857324868 +0000 UTC m=+1451.159839257" watchObservedRunningTime="2026-02-02 13:25:10.869478667 +0000 UTC m=+1451.171993046" Feb 02 13:25:10 crc kubenswrapper[4721]: I0202 13:25:10.927180 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.008190 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.800200 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.841926 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"679713b8-7e9b-4ccc-87f3-85afd17dc008","Type":"ContainerStarted","Data":"f207fef2f76230ae29ed92563bfe6192bf4e5751fb81be7d8bc1400d4d42e087"} Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.845357 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-869db4994-hgxnh" event={"ID":"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8","Type":"ContainerStarted","Data":"9892fb62f9251fa656b60a855b779d0ef03617e58fee62707134474322dbab8a"} Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.845485 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.849256 4721 generic.go:334] "Generic (PLEG): container finished" podID="c5f7cb67-4d7c-4bc8-bf45-c949450206f0" containerID="fbc9c66b85258c4a25c9415a67c1a7b4969a3f4e6fb178430c5d1cfa81c475d3" exitCode=1 Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.849332 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" event={"ID":"c5f7cb67-4d7c-4bc8-bf45-c949450206f0","Type":"ContainerDied","Data":"fbc9c66b85258c4a25c9415a67c1a7b4969a3f4e6fb178430c5d1cfa81c475d3"} Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.849372 4721 scope.go:117] "RemoveContainer" containerID="cacfe9dad3c5ac1611cb10a831ba529e6821a0590168b211ab46225955ae992f" Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.850199 4721 scope.go:117] "RemoveContainer" containerID="fbc9c66b85258c4a25c9415a67c1a7b4969a3f4e6fb178430c5d1cfa81c475d3" Feb 02 13:25:11 crc kubenswrapper[4721]: E0202 13:25:11.850507 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6c76b54b86-n9kln_openstack(c5f7cb67-4d7c-4bc8-bf45-c949450206f0)\"" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" 
podUID="c5f7cb67-4d7c-4bc8-bf45-c949450206f0" Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.856826 4721 generic.go:334] "Generic (PLEG): container finished" podID="ee4f36c4-39c4-4cb4-b24c-676b76966752" containerID="fa800e1436f00d168fdcf5bb1372a6b44ff3f3a66add3f54482b3057f2c14bd2" exitCode=0 Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.856914 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-8588ddc4dc-rq722" event={"ID":"ee4f36c4-39c4-4cb4-b24c-676b76966752","Type":"ContainerDied","Data":"fa800e1436f00d168fdcf5bb1372a6b44ff3f3a66add3f54482b3057f2c14bd2"} Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.857259 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-8588ddc4dc-rq722" event={"ID":"ee4f36c4-39c4-4cb4-b24c-676b76966752","Type":"ContainerDied","Data":"22a62ef40c5e00afd66630e4761effc79fe1d27fa65d0590da25c3c1d43ad9bf"} Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.856933 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-8588ddc4dc-rq722" Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.857774 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cwgk\" (UniqueName: \"kubernetes.io/projected/ee4f36c4-39c4-4cb4-b24c-676b76966752-kube-api-access-9cwgk\") pod \"ee4f36c4-39c4-4cb4-b24c-676b76966752\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.857914 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-config-data\") pod \"ee4f36c4-39c4-4cb4-b24c-676b76966752\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.858097 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-config-data-custom\") pod \"ee4f36c4-39c4-4cb4-b24c-676b76966752\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.858181 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-combined-ca-bundle\") pod \"ee4f36c4-39c4-4cb4-b24c-676b76966752\" (UID: \"ee4f36c4-39c4-4cb4-b24c-676b76966752\") " Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.870760 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0eabfa0b-0304-4eda-8f8a-dc9160569e4b","Type":"ContainerStarted","Data":"fe66034a92edf80281d3c1b9d4b4c7c688f32d48b13fd5da1326e2459fea155d"} Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.870803 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0eabfa0b-0304-4eda-8f8a-dc9160569e4b","Type":"ContainerStarted","Data":"bba10c67506fd0369bb5fd8b56802fa27fb210d2c301daa9dcdf6f4098bdaff1"} Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.871826 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.923056 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee4f36c4-39c4-4cb4-b24c-676b76966752-kube-api-access-9cwgk" (OuterVolumeSpecName: 
"kube-api-access-9cwgk") pod "ee4f36c4-39c4-4cb4-b24c-676b76966752" (UID: "ee4f36c4-39c4-4cb4-b24c-676b76966752"). InnerVolumeSpecName "kube-api-access-9cwgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.926363 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ee4f36c4-39c4-4cb4-b24c-676b76966752" (UID: "ee4f36c4-39c4-4cb4-b24c-676b76966752"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.964856 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cwgk\" (UniqueName: \"kubernetes.io/projected/ee4f36c4-39c4-4cb4-b24c-676b76966752-kube-api-access-9cwgk\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.964901 4721 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:11 crc kubenswrapper[4721]: I0202 13:25:11.983433 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-config-data" (OuterVolumeSpecName: "config-data") pod "ee4f36c4-39c4-4cb4-b24c-676b76966752" (UID: "ee4f36c4-39c4-4cb4-b24c-676b76966752"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.022854 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.022837887 podStartE2EDuration="6.022837887s" podCreationTimestamp="2026-02-02 13:25:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:25:12.018307565 +0000 UTC m=+1452.320821964" watchObservedRunningTime="2026-02-02 13:25:12.022837887 +0000 UTC m=+1452.325352266" Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.060812 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee4f36c4-39c4-4cb4-b24c-676b76966752" (UID: "ee4f36c4-39c4-4cb4-b24c-676b76966752"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.066999 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.067040 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee4f36c4-39c4-4cb4-b24c-676b76966752-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.252839 4721 scope.go:117] "RemoveContainer" containerID="fa800e1436f00d168fdcf5bb1372a6b44ff3f3a66add3f54482b3057f2c14bd2" Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.256341 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-8588ddc4dc-rq722"] Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.267307 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-8588ddc4dc-rq722"] Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.299892 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.320673 4721 scope.go:117] "RemoveContainer" containerID="fa800e1436f00d168fdcf5bb1372a6b44ff3f3a66add3f54482b3057f2c14bd2" Feb 02 13:25:12 crc kubenswrapper[4721]: E0202 13:25:12.321227 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa800e1436f00d168fdcf5bb1372a6b44ff3f3a66add3f54482b3057f2c14bd2\": container with ID starting with fa800e1436f00d168fdcf5bb1372a6b44ff3f3a66add3f54482b3057f2c14bd2 not found: ID does not exist" containerID="fa800e1436f00d168fdcf5bb1372a6b44ff3f3a66add3f54482b3057f2c14bd2" Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.321270 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa800e1436f00d168fdcf5bb1372a6b44ff3f3a66add3f54482b3057f2c14bd2"} err="failed to get container status \"fa800e1436f00d168fdcf5bb1372a6b44ff3f3a66add3f54482b3057f2c14bd2\": rpc error: code = NotFound desc = could not find container \"fa800e1436f00d168fdcf5bb1372a6b44ff3f3a66add3f54482b3057f2c14bd2\": container with ID starting with fa800e1436f00d168fdcf5bb1372a6b44ff3f3a66add3f54482b3057f2c14bd2 not found: ID does not exist" Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.425267 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee4f36c4-39c4-4cb4-b24c-676b76966752" path="/var/lib/kubelet/pods/ee4f36c4-39c4-4cb4-b24c-676b76966752/volumes" Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.893864 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"679713b8-7e9b-4ccc-87f3-85afd17dc008","Type":"ContainerStarted","Data":"3ad385a2be5227562f34db7d77851850b136530db655dfc87e87234604886068"} Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.896663 4721 generic.go:334] "Generic (PLEG): container finished" podID="2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" containerID="9892fb62f9251fa656b60a855b779d0ef03617e58fee62707134474322dbab8a" exitCode=1 Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.896753 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-869db4994-hgxnh" 
event={"ID":"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8","Type":"ContainerDied","Data":"9892fb62f9251fa656b60a855b779d0ef03617e58fee62707134474322dbab8a"} Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.896793 4721 scope.go:117] "RemoveContainer" containerID="d6a7a5707b26d69fb1863e451668fcdd388419479b4005bb1d55054ada2fb366" Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.897423 4721 scope.go:117] "RemoveContainer" containerID="9892fb62f9251fa656b60a855b779d0ef03617e58fee62707134474322dbab8a" Feb 02 13:25:12 crc kubenswrapper[4721]: E0202 13:25:12.897788 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-869db4994-hgxnh_openstack(2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8)\"" pod="openstack/heat-api-869db4994-hgxnh" podUID="2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" Feb 02 13:25:12 crc kubenswrapper[4721]: I0202 13:25:12.906816 4721 scope.go:117] "RemoveContainer" containerID="fbc9c66b85258c4a25c9415a67c1a7b4969a3f4e6fb178430c5d1cfa81c475d3" Feb 02 13:25:12 crc kubenswrapper[4721]: E0202 13:25:12.907104 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6c76b54b86-n9kln_openstack(c5f7cb67-4d7c-4bc8-bf45-c949450206f0)\"" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" podUID="c5f7cb67-4d7c-4bc8-bf45-c949450206f0" Feb 02 13:25:13 crc kubenswrapper[4721]: I0202 13:25:13.815111 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 13:25:13 crc kubenswrapper[4721]: I0202 13:25:13.824162 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7cdd3f19-3e66-4807-a0e8-957c713cef36" containerName="glance-httpd" containerID="cri-o://e352ddf5546d205ff44a3674807cce6288e4d9d2631c1e97a5b81f79535b48d6" gracePeriod=30 Feb 02 13:25:13 crc kubenswrapper[4721]: I0202 13:25:13.824554 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7cdd3f19-3e66-4807-a0e8-957c713cef36" containerName="glance-log" containerID="cri-o://60d5f5c74197287dc48184ed76b8f6d34b4d23a85669a6b5c075668ede91ce31" gracePeriod=30 Feb 02 13:25:13 crc kubenswrapper[4721]: I0202 13:25:13.918821 4721 scope.go:117] "RemoveContainer" containerID="9892fb62f9251fa656b60a855b779d0ef03617e58fee62707134474322dbab8a" Feb 02 13:25:13 crc kubenswrapper[4721]: E0202 13:25:13.920047 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-869db4994-hgxnh_openstack(2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8)\"" pod="openstack/heat-api-869db4994-hgxnh" podUID="2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.137108 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.228652 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9czmj"] Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.228942 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" 
podUID="f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" containerName="dnsmasq-dns" containerID="cri-o://79b4d95f7f9db98cc9b0a4ae6c82ff501b94e112bfc5f0a0bf15dd2adbe2a2a3" gracePeriod=10 Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.764841 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.765200 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.880371 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.939343 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"679713b8-7e9b-4ccc-87f3-85afd17dc008","Type":"ContainerStarted","Data":"ab6eff23a607854558fcb2aa7cbae46f97306ff156928f4f08c74867721f4d2c"} Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.939558 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerName="ceilometer-central-agent" containerID="cri-o://22439dbfbc2d5b9a7aaa38b029bb7c102abafe93d1e9397f189c54e9d9a93d8a" gracePeriod=30 Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.939853 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.940287 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerName="proxy-httpd" containerID="cri-o://ab6eff23a607854558fcb2aa7cbae46f97306ff156928f4f08c74867721f4d2c" gracePeriod=30 Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.940354 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerName="sg-core" containerID="cri-o://3ad385a2be5227562f34db7d77851850b136530db655dfc87e87234604886068" gracePeriod=30 Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.940397 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerName="ceilometer-notification-agent" containerID="cri-o://f207fef2f76230ae29ed92563bfe6192bf4e5751fb81be7d8bc1400d4d42e087" gracePeriod=30 Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.946768 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-config\") pod \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.947360 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-ovsdbserver-nb\") pod \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.947454 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-ovsdbserver-sb\") pod \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.947678 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-dns-swift-storage-0\") pod \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.947696 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kpkt\" (UniqueName: \"kubernetes.io/projected/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-kube-api-access-5kpkt\") pod \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.948436 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-dns-svc\") pod \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\" (UID: \"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f\") " Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.967664 4721 generic.go:334] "Generic (PLEG): container finished" podID="7cdd3f19-3e66-4807-a0e8-957c713cef36" containerID="60d5f5c74197287dc48184ed76b8f6d34b4d23a85669a6b5c075668ede91ce31" exitCode=143 Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.967771 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cdd3f19-3e66-4807-a0e8-957c713cef36","Type":"ContainerDied","Data":"60d5f5c74197287dc48184ed76b8f6d34b4d23a85669a6b5c075668ede91ce31"} Feb 02 13:25:14 crc kubenswrapper[4721]: I0202 13:25:14.976773 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.398073452 podStartE2EDuration="10.976744901s" podCreationTimestamp="2026-02-02 13:25:04 +0000 UTC" firstStartedPulling="2026-02-02 13:25:08.391728218 +0000 UTC m=+1448.694242607" lastFinishedPulling="2026-02-02 13:25:13.970399667 +0000 UTC m=+1454.272914056" observedRunningTime="2026-02-02 13:25:14.970878722 +0000 UTC m=+1455.273393121" watchObservedRunningTime="2026-02-02 13:25:14.976744901 +0000 UTC m=+1455.279259290" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.001955 4721 generic.go:334] "Generic (PLEG): container finished" podID="f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" containerID="79b4d95f7f9db98cc9b0a4ae6c82ff501b94e112bfc5f0a0bf15dd2adbe2a2a3" exitCode=0 Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.001986 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.002639 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" event={"ID":"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f","Type":"ContainerDied","Data":"79b4d95f7f9db98cc9b0a4ae6c82ff501b94e112bfc5f0a0bf15dd2adbe2a2a3"} Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.002771 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" event={"ID":"f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f","Type":"ContainerDied","Data":"0e5ec1ece5d46afcb7f1cd43125f924d0c21a100ba9e55bfc9c2436d4917aef7"} Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.002846 4721 scope.go:117] "RemoveContainer" containerID="79b4d95f7f9db98cc9b0a4ae6c82ff501b94e112bfc5f0a0bf15dd2adbe2a2a3" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.003316 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-kube-api-access-5kpkt" (OuterVolumeSpecName: "kube-api-access-5kpkt") pod "f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" (UID: "f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f"). InnerVolumeSpecName "kube-api-access-5kpkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.046618 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" (UID: "f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.064824 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kpkt\" (UniqueName: \"kubernetes.io/projected/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-kube-api-access-5kpkt\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.064863 4721 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.079502 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" (UID: "f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.086200 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mlbxn" podUID="37372b76-ef54-4a44-9b56-dea754373219" containerName="registry-server" probeResult="failure" output=< Feb 02 13:25:15 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:25:15 crc kubenswrapper[4721]: > Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.115684 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" (UID: "f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.121595 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" (UID: "f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.138714 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-config" (OuterVolumeSpecName: "config") pod "f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" (UID: "f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.166540 4721 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.166578 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.166587 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.166595 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.279776 4721 scope.go:117] "RemoveContainer" containerID="4cdfecffc79f466040cbdb103a7f0aa0881b7f82c12764f848d66d7d7434f505" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.304309 4721 scope.go:117] "RemoveContainer" containerID="79b4d95f7f9db98cc9b0a4ae6c82ff501b94e112bfc5f0a0bf15dd2adbe2a2a3" Feb 02 13:25:15 crc kubenswrapper[4721]: E0202 13:25:15.304780 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b4d95f7f9db98cc9b0a4ae6c82ff501b94e112bfc5f0a0bf15dd2adbe2a2a3\": container with ID starting with 79b4d95f7f9db98cc9b0a4ae6c82ff501b94e112bfc5f0a0bf15dd2adbe2a2a3 not found: ID does not exist" containerID="79b4d95f7f9db98cc9b0a4ae6c82ff501b94e112bfc5f0a0bf15dd2adbe2a2a3" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.304827 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b4d95f7f9db98cc9b0a4ae6c82ff501b94e112bfc5f0a0bf15dd2adbe2a2a3"} err="failed to get container status \"79b4d95f7f9db98cc9b0a4ae6c82ff501b94e112bfc5f0a0bf15dd2adbe2a2a3\": rpc error: code = NotFound desc = could not find container \"79b4d95f7f9db98cc9b0a4ae6c82ff501b94e112bfc5f0a0bf15dd2adbe2a2a3\": container with ID starting with 79b4d95f7f9db98cc9b0a4ae6c82ff501b94e112bfc5f0a0bf15dd2adbe2a2a3 not found: ID does not exist" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.304847 4721 scope.go:117] "RemoveContainer" 
containerID="4cdfecffc79f466040cbdb103a7f0aa0881b7f82c12764f848d66d7d7434f505" Feb 02 13:25:15 crc kubenswrapper[4721]: E0202 13:25:15.305241 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cdfecffc79f466040cbdb103a7f0aa0881b7f82c12764f848d66d7d7434f505\": container with ID starting with 4cdfecffc79f466040cbdb103a7f0aa0881b7f82c12764f848d66d7d7434f505 not found: ID does not exist" containerID="4cdfecffc79f466040cbdb103a7f0aa0881b7f82c12764f848d66d7d7434f505" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.305291 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cdfecffc79f466040cbdb103a7f0aa0881b7f82c12764f848d66d7d7434f505"} err="failed to get container status \"4cdfecffc79f466040cbdb103a7f0aa0881b7f82c12764f848d66d7d7434f505\": rpc error: code = NotFound desc = could not find container \"4cdfecffc79f466040cbdb103a7f0aa0881b7f82c12764f848d66d7d7434f505\": container with ID starting with 4cdfecffc79f466040cbdb103a7f0aa0881b7f82c12764f848d66d7d7434f505 not found: ID does not exist" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.359295 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9czmj"] Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.379430 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-9czmj"] Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.417026 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.417273 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8075cf6d-3ae0-468e-98cb-5f341d78b8ac" containerName="glance-log" containerID="cri-o://6b2f4ace0eb75cc9899716f3900092d7c461a3838ae11f1c1f66ca5572586d0f" gracePeriod=30 Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.417372 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8075cf6d-3ae0-468e-98cb-5f341d78b8ac" containerName="glance-httpd" containerID="cri-o://8d37382e602df793413e2180a90609111293c970770dc8adb79b0513cd945df9" gracePeriod=30 Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.927460 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:15 crc kubenswrapper[4721]: I0202 13:25:15.928882 4721 scope.go:117] "RemoveContainer" containerID="9892fb62f9251fa656b60a855b779d0ef03617e58fee62707134474322dbab8a" Feb 02 13:25:15 crc kubenswrapper[4721]: E0202 13:25:15.929224 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-869db4994-hgxnh_openstack(2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8)\"" pod="openstack/heat-api-869db4994-hgxnh" podUID="2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" Feb 02 13:25:16 crc kubenswrapper[4721]: I0202 13:25:16.005377 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:16 crc kubenswrapper[4721]: I0202 13:25:16.006651 4721 scope.go:117] "RemoveContainer" containerID="fbc9c66b85258c4a25c9415a67c1a7b4969a3f4e6fb178430c5d1cfa81c475d3" Feb 02 13:25:16 crc kubenswrapper[4721]: E0202 13:25:16.006948 4721 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6c76b54b86-n9kln_openstack(c5f7cb67-4d7c-4bc8-bf45-c949450206f0)\"" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" podUID="c5f7cb67-4d7c-4bc8-bf45-c949450206f0" Feb 02 13:25:16 crc kubenswrapper[4721]: I0202 13:25:16.010298 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:16 crc kubenswrapper[4721]: I0202 13:25:16.043359 4721 generic.go:334] "Generic (PLEG): container finished" podID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerID="ab6eff23a607854558fcb2aa7cbae46f97306ff156928f4f08c74867721f4d2c" exitCode=0 Feb 02 13:25:16 crc kubenswrapper[4721]: I0202 13:25:16.043393 4721 generic.go:334] "Generic (PLEG): container finished" podID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerID="3ad385a2be5227562f34db7d77851850b136530db655dfc87e87234604886068" exitCode=2 Feb 02 13:25:16 crc kubenswrapper[4721]: I0202 13:25:16.043404 4721 generic.go:334] "Generic (PLEG): container finished" podID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerID="f207fef2f76230ae29ed92563bfe6192bf4e5751fb81be7d8bc1400d4d42e087" exitCode=0 Feb 02 13:25:16 crc kubenswrapper[4721]: I0202 13:25:16.043469 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"679713b8-7e9b-4ccc-87f3-85afd17dc008","Type":"ContainerDied","Data":"ab6eff23a607854558fcb2aa7cbae46f97306ff156928f4f08c74867721f4d2c"} Feb 02 13:25:16 crc kubenswrapper[4721]: I0202 13:25:16.043504 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"679713b8-7e9b-4ccc-87f3-85afd17dc008","Type":"ContainerDied","Data":"3ad385a2be5227562f34db7d77851850b136530db655dfc87e87234604886068"} Feb 02 13:25:16 crc kubenswrapper[4721]: I0202 13:25:16.043513 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"679713b8-7e9b-4ccc-87f3-85afd17dc008","Type":"ContainerDied","Data":"f207fef2f76230ae29ed92563bfe6192bf4e5751fb81be7d8bc1400d4d42e087"} Feb 02 13:25:16 crc kubenswrapper[4721]: I0202 13:25:16.050446 4721 generic.go:334] "Generic (PLEG): container finished" podID="8075cf6d-3ae0-468e-98cb-5f341d78b8ac" containerID="6b2f4ace0eb75cc9899716f3900092d7c461a3838ae11f1c1f66ca5572586d0f" exitCode=143 Feb 02 13:25:16 crc kubenswrapper[4721]: I0202 13:25:16.050515 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8075cf6d-3ae0-468e-98cb-5f341d78b8ac","Type":"ContainerDied","Data":"6b2f4ace0eb75cc9899716f3900092d7c461a3838ae11f1c1f66ca5572586d0f"} Feb 02 13:25:16 crc kubenswrapper[4721]: I0202 13:25:16.054383 4721 scope.go:117] "RemoveContainer" containerID="fbc9c66b85258c4a25c9415a67c1a7b4969a3f4e6fb178430c5d1cfa81c475d3" Feb 02 13:25:16 crc kubenswrapper[4721]: E0202 13:25:16.054701 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6c76b54b86-n9kln_openstack(c5f7cb67-4d7c-4bc8-bf45-c949450206f0)\"" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" podUID="c5f7cb67-4d7c-4bc8-bf45-c949450206f0" Feb 02 13:25:16 crc kubenswrapper[4721]: I0202 13:25:16.425805 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" 
path="/var/lib/kubelet/pods/f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f/volumes" Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.004637 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.072125 4721 generic.go:334] "Generic (PLEG): container finished" podID="7cdd3f19-3e66-4807-a0e8-957c713cef36" containerID="e352ddf5546d205ff44a3674807cce6288e4d9d2631c1e97a5b81f79535b48d6" exitCode=0 Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.072333 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cdd3f19-3e66-4807-a0e8-957c713cef36","Type":"ContainerDied","Data":"e352ddf5546d205ff44a3674807cce6288e4d9d2631c1e97a5b81f79535b48d6"} Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.698367 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.837864 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cdd3f19-3e66-4807-a0e8-957c713cef36-httpd-run\") pod \"7cdd3f19-3e66-4807-a0e8-957c713cef36\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.837937 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cdd3f19-3e66-4807-a0e8-957c713cef36-logs\") pod \"7cdd3f19-3e66-4807-a0e8-957c713cef36\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.838000 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-config-data\") pod \"7cdd3f19-3e66-4807-a0e8-957c713cef36\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.843210 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") pod \"7cdd3f19-3e66-4807-a0e8-957c713cef36\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.843294 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-public-tls-certs\") pod \"7cdd3f19-3e66-4807-a0e8-957c713cef36\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.843527 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-scripts\") pod \"7cdd3f19-3e66-4807-a0e8-957c713cef36\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.843552 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpqt2\" (UniqueName: \"kubernetes.io/projected/7cdd3f19-3e66-4807-a0e8-957c713cef36-kube-api-access-vpqt2\") pod \"7cdd3f19-3e66-4807-a0e8-957c713cef36\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.843699 4721 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-combined-ca-bundle\") pod \"7cdd3f19-3e66-4807-a0e8-957c713cef36\" (UID: \"7cdd3f19-3e66-4807-a0e8-957c713cef36\") " Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.843738 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cdd3f19-3e66-4807-a0e8-957c713cef36-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7cdd3f19-3e66-4807-a0e8-957c713cef36" (UID: "7cdd3f19-3e66-4807-a0e8-957c713cef36"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.845482 4721 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cdd3f19-3e66-4807-a0e8-957c713cef36-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.846002 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cdd3f19-3e66-4807-a0e8-957c713cef36-logs" (OuterVolumeSpecName: "logs") pod "7cdd3f19-3e66-4807-a0e8-957c713cef36" (UID: "7cdd3f19-3e66-4807-a0e8-957c713cef36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.884514 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cdd3f19-3e66-4807-a0e8-957c713cef36-kube-api-access-vpqt2" (OuterVolumeSpecName: "kube-api-access-vpqt2") pod "7cdd3f19-3e66-4807-a0e8-957c713cef36" (UID: "7cdd3f19-3e66-4807-a0e8-957c713cef36"). InnerVolumeSpecName "kube-api-access-vpqt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.894330 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-scripts" (OuterVolumeSpecName: "scripts") pod "7cdd3f19-3e66-4807-a0e8-957c713cef36" (UID: "7cdd3f19-3e66-4807-a0e8-957c713cef36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.948164 4721 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cdd3f19-3e66-4807-a0e8-957c713cef36-logs\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.948595 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.948682 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpqt2\" (UniqueName: \"kubernetes.io/projected/7cdd3f19-3e66-4807-a0e8-957c713cef36-kube-api-access-vpqt2\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:17 crc kubenswrapper[4721]: I0202 13:25:17.975308 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cdd3f19-3e66-4807-a0e8-957c713cef36" (UID: "7cdd3f19-3e66-4807-a0e8-957c713cef36"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.058146 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.165426 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cdd3f19-3e66-4807-a0e8-957c713cef36","Type":"ContainerDied","Data":"57b27fa9d3d382916d0fc10e2c5127febea2102b5b13a8f86a12bdcd66a461e9"} Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.165488 4721 scope.go:117] "RemoveContainer" containerID="e352ddf5546d205ff44a3674807cce6288e4d9d2631c1e97a5b81f79535b48d6" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.165674 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.188981 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-config-data" (OuterVolumeSpecName: "config-data") pod "7cdd3f19-3e66-4807-a0e8-957c713cef36" (UID: "7cdd3f19-3e66-4807-a0e8-957c713cef36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.209616 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663" (OuterVolumeSpecName: "glance") pod "7cdd3f19-3e66-4807-a0e8-957c713cef36" (UID: "7cdd3f19-3e66-4807-a0e8-957c713cef36"). InnerVolumeSpecName "pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.231252 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7cdd3f19-3e66-4807-a0e8-957c713cef36" (UID: "7cdd3f19-3e66-4807-a0e8-957c713cef36"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.272577 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.272657 4721 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") on node \"crc\" " Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.272672 4721 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cdd3f19-3e66-4807-a0e8-957c713cef36-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.333569 4721 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.333751 4721 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663") on node "crc" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.374471 4721 reconciler_common.go:293] "Volume detached for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.399126 4721 scope.go:117] "RemoveContainer" containerID="60d5f5c74197287dc48184ed76b8f6d34b4d23a85669a6b5c075668ede91ce31" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.510437 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.528480 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.542274 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 13:25:18 crc kubenswrapper[4721]: E0202 13:25:18.543103 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" containerName="init" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.543119 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" containerName="init" Feb 02 13:25:18 crc kubenswrapper[4721]: E0202 13:25:18.543149 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee4f36c4-39c4-4cb4-b24c-676b76966752" containerName="heat-api" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.543157 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4f36c4-39c4-4cb4-b24c-676b76966752" containerName="heat-api" Feb 02 13:25:18 crc kubenswrapper[4721]: E0202 13:25:18.543167 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cdd3f19-3e66-4807-a0e8-957c713cef36" containerName="glance-httpd" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.543173 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cdd3f19-3e66-4807-a0e8-957c713cef36" containerName="glance-httpd" Feb 02 13:25:18 crc kubenswrapper[4721]: E0202 13:25:18.543192 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" containerName="dnsmasq-dns" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.543197 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" containerName="dnsmasq-dns" Feb 02 13:25:18 crc kubenswrapper[4721]: E0202 13:25:18.543213 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cdd3f19-3e66-4807-a0e8-957c713cef36" containerName="glance-log" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.543219 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cdd3f19-3e66-4807-a0e8-957c713cef36" containerName="glance-log" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.543466 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cdd3f19-3e66-4807-a0e8-957c713cef36" containerName="glance-httpd" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.543492 4721 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7cdd3f19-3e66-4807-a0e8-957c713cef36" containerName="glance-log" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.543515 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" containerName="dnsmasq-dns" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.543539 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee4f36c4-39c4-4cb4-b24c-676b76966752" containerName="heat-api" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.545034 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.547873 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.548041 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.567308 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.687148 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e9328e-fd9a-4a87-946b-2c46e25bea51-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.687244 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e9328e-fd9a-4a87-946b-2c46e25bea51-config-data\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.687293 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e9328e-fd9a-4a87-946b-2c46e25bea51-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.687336 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89lj5\" (UniqueName: \"kubernetes.io/projected/23e9328e-fd9a-4a87-946b-2c46e25bea51-kube-api-access-89lj5\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.687368 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e9328e-fd9a-4a87-946b-2c46e25bea51-scripts\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.687492 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") pod 
\"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.687612 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23e9328e-fd9a-4a87-946b-2c46e25bea51-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.687669 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23e9328e-fd9a-4a87-946b-2c46e25bea51-logs\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.790541 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23e9328e-fd9a-4a87-946b-2c46e25bea51-logs\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.790638 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e9328e-fd9a-4a87-946b-2c46e25bea51-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.790719 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e9328e-fd9a-4a87-946b-2c46e25bea51-config-data\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.790763 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e9328e-fd9a-4a87-946b-2c46e25bea51-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.790801 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89lj5\" (UniqueName: \"kubernetes.io/projected/23e9328e-fd9a-4a87-946b-2c46e25bea51-kube-api-access-89lj5\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.790841 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e9328e-fd9a-4a87-946b-2c46e25bea51-scripts\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.790984 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") pod 
\"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.791166 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23e9328e-fd9a-4a87-946b-2c46e25bea51-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.794162 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23e9328e-fd9a-4a87-946b-2c46e25bea51-logs\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.796029 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23e9328e-fd9a-4a87-946b-2c46e25bea51-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.800338 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e9328e-fd9a-4a87-946b-2c46e25bea51-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.800359 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23e9328e-fd9a-4a87-946b-2c46e25bea51-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.801025 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23e9328e-fd9a-4a87-946b-2c46e25bea51-config-data\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.801028 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23e9328e-fd9a-4a87-946b-2c46e25bea51-scripts\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.804178 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.804227 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c7812605d9919b226d4340fce797cd8fb18c9c948d1e68864aa7eb7aeecf4816/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.818858 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89lj5\" (UniqueName: \"kubernetes.io/projected/23e9328e-fd9a-4a87-946b-2c46e25bea51-kube-api-access-89lj5\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:18 crc kubenswrapper[4721]: I0202 13:25:18.963678 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-41e23a11-a3eb-45e5-a6c7-a5d94d094663\") pod \"glance-default-external-api-0\" (UID: \"23e9328e-fd9a-4a87-946b-2c46e25bea51\") " pod="openstack/glance-default-external-api-0" Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.170410 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.193326 4721 generic.go:334] "Generic (PLEG): container finished" podID="8075cf6d-3ae0-468e-98cb-5f341d78b8ac" containerID="8d37382e602df793413e2180a90609111293c970770dc8adb79b0513cd945df9" exitCode=0 Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.193397 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8075cf6d-3ae0-468e-98cb-5f341d78b8ac","Type":"ContainerDied","Data":"8d37382e602df793413e2180a90609111293c970770dc8adb79b0513cd945df9"} Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.581223 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9776ccc5-9czmj" podUID="f1c8acb8-ffe3-467a-a19b-9ccdc0fca18f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.212:5353: i/o timeout" Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.863167 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.921098 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-combined-ca-bundle\") pod \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.921203 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-httpd-run\") pod \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.921262 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6x5k\" (UniqueName: \"kubernetes.io/projected/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-kube-api-access-r6x5k\") pod \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.930166 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8075cf6d-3ae0-468e-98cb-5f341d78b8ac" (UID: "8075cf6d-3ae0-468e-98cb-5f341d78b8ac"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.934299 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") pod \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.941160 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-logs\") pod \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.949421 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-logs" (OuterVolumeSpecName: "logs") pod "8075cf6d-3ae0-468e-98cb-5f341d78b8ac" (UID: "8075cf6d-3ae0-468e-98cb-5f341d78b8ac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.952530 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-kube-api-access-r6x5k" (OuterVolumeSpecName: "kube-api-access-r6x5k") pod "8075cf6d-3ae0-468e-98cb-5f341d78b8ac" (UID: "8075cf6d-3ae0-468e-98cb-5f341d78b8ac"). InnerVolumeSpecName "kube-api-access-r6x5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.953085 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-internal-tls-certs\") pod \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.953245 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-scripts\") pod \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.953292 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-config-data\") pod \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\" (UID: \"8075cf6d-3ae0-468e-98cb-5f341d78b8ac\") " Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.954502 4721 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.954517 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6x5k\" (UniqueName: \"kubernetes.io/projected/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-kube-api-access-r6x5k\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.954527 4721 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-logs\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:19 crc kubenswrapper[4721]: I0202 13:25:19.966342 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-scripts" (OuterVolumeSpecName: "scripts") pod "8075cf6d-3ae0-468e-98cb-5f341d78b8ac" (UID: "8075cf6d-3ae0-468e-98cb-5f341d78b8ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.045599 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8075cf6d-3ae0-468e-98cb-5f341d78b8ac" (UID: "8075cf6d-3ae0-468e-98cb-5f341d78b8ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.087432 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.087471 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.132600 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8075cf6d-3ae0-468e-98cb-5f341d78b8ac" (UID: "8075cf6d-3ae0-468e-98cb-5f341d78b8ac"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.148316 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79" (OuterVolumeSpecName: "glance") pod "8075cf6d-3ae0-468e-98cb-5f341d78b8ac" (UID: "8075cf6d-3ae0-468e-98cb-5f341d78b8ac"). InnerVolumeSpecName "pvc-56daf3c3-162c-4970-aab6-c4cecea22e79". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.214401 4721 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") on node \"crc\" " Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.214446 4721 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.225455 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.243854 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-config-data" (OuterVolumeSpecName: "config-data") pod "8075cf6d-3ae0-468e-98cb-5f341d78b8ac" (UID: "8075cf6d-3ae0-468e-98cb-5f341d78b8ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.269365 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8075cf6d-3ae0-468e-98cb-5f341d78b8ac","Type":"ContainerDied","Data":"901860dc4f2cb59ed85a54f8bc10b9859a36c07381edb1151f125b84138e4df8"} Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.269443 4721 scope.go:117] "RemoveContainer" containerID="8d37382e602df793413e2180a90609111293c970770dc8adb79b0513cd945df9" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.269544 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.299734 4721 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping UnmountDevice... Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.299934 4721 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-56daf3c3-162c-4970-aab6-c4cecea22e79" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79") on node "crc" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.316927 4721 reconciler_common.go:293] "Volume detached for volume \"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.316985 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8075cf6d-3ae0-468e-98cb-5f341d78b8ac-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.398401 4721 scope.go:117] "RemoveContainer" containerID="6b2f4ace0eb75cc9899716f3900092d7c461a3838ae11f1c1f66ca5572586d0f" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.462362 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cdd3f19-3e66-4807-a0e8-957c713cef36" path="/var/lib/kubelet/pods/7cdd3f19-3e66-4807-a0e8-957c713cef36/volumes" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.602059 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.618395 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.637088 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 13:25:20 crc kubenswrapper[4721]: E0202 13:25:20.637632 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8075cf6d-3ae0-468e-98cb-5f341d78b8ac" containerName="glance-httpd" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.637645 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="8075cf6d-3ae0-468e-98cb-5f341d78b8ac" containerName="glance-httpd" Feb 02 13:25:20 crc kubenswrapper[4721]: E0202 13:25:20.637662 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8075cf6d-3ae0-468e-98cb-5f341d78b8ac" containerName="glance-log" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.637668 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="8075cf6d-3ae0-468e-98cb-5f341d78b8ac" containerName="glance-log" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.637863 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="8075cf6d-3ae0-468e-98cb-5f341d78b8ac" containerName="glance-log" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.637884 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="8075cf6d-3ae0-468e-98cb-5f341d78b8ac" containerName="glance-httpd" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.641892 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.646188 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.646511 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.693595 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.831384 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-22bg7"] Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.833350 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-22bg7" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.843188 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.843471 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.843641 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.843725 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.843842 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.843996 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v68kr\" (UniqueName: \"kubernetes.io/projected/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-kube-api-access-v68kr\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.844270 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.844341 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.849641 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-22bg7"] Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.948235 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v68kr\" (UniqueName: \"kubernetes.io/projected/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-kube-api-access-v68kr\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.948295 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.948338 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.948396 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.948443 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz92z\" (UniqueName: \"kubernetes.io/projected/31d33cb7-8d98-44cc-97ef-229d34805e46-kube-api-access-cz92z\") pod \"nova-api-db-create-22bg7\" (UID: \"31d33cb7-8d98-44cc-97ef-229d34805e46\") " pod="openstack/nova-api-db-create-22bg7" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.948506 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.948537 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31d33cb7-8d98-44cc-97ef-229d34805e46-operator-scripts\") pod 
\"nova-api-db-create-22bg7\" (UID: \"31d33cb7-8d98-44cc-97ef-229d34805e46\") " pod="openstack/nova-api-db-create-22bg7" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.948661 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.948693 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.948770 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.951510 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.952053 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0" Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.955161 4721 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.955203 4721 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f813ebfbde533117d7c5539927c335132efd30cff3e1cb355d78cb9d4c1a927/globalmount\"" pod="openstack/glance-default-internal-api-0"
Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.963273 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0"
Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.963479 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0"
Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.963680 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0"
Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.968257 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0"
Feb 02 13:25:20 crc kubenswrapper[4721]: I0202 13:25:20.971160 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v68kr\" (UniqueName: \"kubernetes.io/projected/5f5129b1-fc26-40ba-9cf7-0f86e93507cd-kube-api-access-v68kr\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.045232 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-zbntf"]
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.048020 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zbntf"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.051119 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz92z\" (UniqueName: \"kubernetes.io/projected/31d33cb7-8d98-44cc-97ef-229d34805e46-kube-api-access-cz92z\") pod \"nova-api-db-create-22bg7\" (UID: \"31d33cb7-8d98-44cc-97ef-229d34805e46\") " pod="openstack/nova-api-db-create-22bg7"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.051202 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31d33cb7-8d98-44cc-97ef-229d34805e46-operator-scripts\") pod \"nova-api-db-create-22bg7\" (UID: \"31d33cb7-8d98-44cc-97ef-229d34805e46\") " pod="openstack/nova-api-db-create-22bg7"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.051970 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31d33cb7-8d98-44cc-97ef-229d34805e46-operator-scripts\") pod \"nova-api-db-create-22bg7\" (UID: \"31d33cb7-8d98-44cc-97ef-229d34805e46\") " pod="openstack/nova-api-db-create-22bg7"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.086826 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz92z\" (UniqueName: \"kubernetes.io/projected/31d33cb7-8d98-44cc-97ef-229d34805e46-kube-api-access-cz92z\") pod \"nova-api-db-create-22bg7\" (UID: \"31d33cb7-8d98-44cc-97ef-229d34805e46\") " pod="openstack/nova-api-db-create-22bg7"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.109199 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zbntf"]
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.135002 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56daf3c3-162c-4970-aab6-c4cecea22e79\") pod \"glance-default-internal-api-0\" (UID: \"5f5129b1-fc26-40ba-9cf7-0f86e93507cd\") " pod="openstack/glance-default-internal-api-0"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.146390 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="0eabfa0b-0304-4eda-8f8a-dc9160569e4b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.228:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.157103 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmnpw\" (UniqueName: \"kubernetes.io/projected/84e297f9-7808-4195-86b2-2c17f4638bf2-kube-api-access-bmnpw\") pod \"nova-cell0-db-create-zbntf\" (UID: \"84e297f9-7808-4195-86b2-2c17f4638bf2\") " pod="openstack/nova-cell0-db-create-zbntf"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.157312 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e297f9-7808-4195-86b2-2c17f4638bf2-operator-scripts\") pod \"nova-cell0-db-create-zbntf\" (UID: \"84e297f9-7808-4195-86b2-2c17f4638bf2\") " pod="openstack/nova-cell0-db-create-zbntf"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.195527 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-22bg7"
Need to start a new one" pod="openstack/nova-api-db-create-22bg7" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.241814 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-7hfxs"] Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.244918 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7hfxs" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.261097 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmnpw\" (UniqueName: \"kubernetes.io/projected/84e297f9-7808-4195-86b2-2c17f4638bf2-kube-api-access-bmnpw\") pod \"nova-cell0-db-create-zbntf\" (UID: \"84e297f9-7808-4195-86b2-2c17f4638bf2\") " pod="openstack/nova-cell0-db-create-zbntf" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.261266 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e297f9-7808-4195-86b2-2c17f4638bf2-operator-scripts\") pod \"nova-cell0-db-create-zbntf\" (UID: \"84e297f9-7808-4195-86b2-2c17f4638bf2\") " pod="openstack/nova-cell0-db-create-zbntf" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.262285 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e297f9-7808-4195-86b2-2c17f4638bf2-operator-scripts\") pod \"nova-cell0-db-create-zbntf\" (UID: \"84e297f9-7808-4195-86b2-2c17f4638bf2\") " pod="openstack/nova-cell0-db-create-zbntf" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.264319 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7hfxs"] Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.303688 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.308542 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmnpw\" (UniqueName: \"kubernetes.io/projected/84e297f9-7808-4195-86b2-2c17f4638bf2-kube-api-access-bmnpw\") pod \"nova-cell0-db-create-zbntf\" (UID: \"84e297f9-7808-4195-86b2-2c17f4638bf2\") " pod="openstack/nova-cell0-db-create-zbntf" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.315678 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-596786fd64-rpzql" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.317235 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-ecf5-account-create-update-k6kdv"] Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.322259 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zbntf" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.334462 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ecf5-account-create-update-k6kdv" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.347923 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.360556 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ecf5-account-create-update-k6kdv"] Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.363863 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flbcl\" (UniqueName: \"kubernetes.io/projected/147719a3-96ca-4551-a395-648dd45b4ce6-kube-api-access-flbcl\") pod \"nova-cell1-db-create-7hfxs\" (UID: \"147719a3-96ca-4551-a395-648dd45b4ce6\") " pod="openstack/nova-cell1-db-create-7hfxs" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.363927 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/147719a3-96ca-4551-a395-648dd45b4ce6-operator-scripts\") pod \"nova-cell1-db-create-7hfxs\" (UID: \"147719a3-96ca-4551-a395-648dd45b4ce6\") " pod="openstack/nova-cell1-db-create-7hfxs" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.441045 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23e9328e-fd9a-4a87-946b-2c46e25bea51","Type":"ContainerStarted","Data":"03d1e984fbc3dacab36da010e889edf1bc89ded6833cebae3aae0dcfb162406b"} Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.489029 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/147719a3-96ca-4551-a395-648dd45b4ce6-operator-scripts\") pod \"nova-cell1-db-create-7hfxs\" (UID: \"147719a3-96ca-4551-a395-648dd45b4ce6\") " pod="openstack/nova-cell1-db-create-7hfxs" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.489187 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k25zs\" (UniqueName: \"kubernetes.io/projected/a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1-kube-api-access-k25zs\") pod \"nova-api-ecf5-account-create-update-k6kdv\" (UID: \"a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1\") " pod="openstack/nova-api-ecf5-account-create-update-k6kdv" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.489542 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1-operator-scripts\") pod \"nova-api-ecf5-account-create-update-k6kdv\" (UID: \"a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1\") " pod="openstack/nova-api-ecf5-account-create-update-k6kdv" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.490119 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flbcl\" (UniqueName: \"kubernetes.io/projected/147719a3-96ca-4551-a395-648dd45b4ce6-kube-api-access-flbcl\") pod \"nova-cell1-db-create-7hfxs\" (UID: \"147719a3-96ca-4551-a395-648dd45b4ce6\") " pod="openstack/nova-cell1-db-create-7hfxs" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.512806 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/147719a3-96ca-4551-a395-648dd45b4ce6-operator-scripts\") pod \"nova-cell1-db-create-7hfxs\" (UID: 
\"147719a3-96ca-4551-a395-648dd45b4ce6\") " pod="openstack/nova-cell1-db-create-7hfxs" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.572911 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flbcl\" (UniqueName: \"kubernetes.io/projected/147719a3-96ca-4551-a395-648dd45b4ce6-kube-api-access-flbcl\") pod \"nova-cell1-db-create-7hfxs\" (UID: \"147719a3-96ca-4551-a395-648dd45b4ce6\") " pod="openstack/nova-cell1-db-create-7hfxs" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.608979 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-5c37-account-create-update-h9w2m"] Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.615114 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5c37-account-create-update-h9w2m" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.618136 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k25zs\" (UniqueName: \"kubernetes.io/projected/a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1-kube-api-access-k25zs\") pod \"nova-api-ecf5-account-create-update-k6kdv\" (UID: \"a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1\") " pod="openstack/nova-api-ecf5-account-create-update-k6kdv" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.618248 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1-operator-scripts\") pod \"nova-api-ecf5-account-create-update-k6kdv\" (UID: \"a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1\") " pod="openstack/nova-api-ecf5-account-create-update-k6kdv" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.619240 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1-operator-scripts\") pod \"nova-api-ecf5-account-create-update-k6kdv\" (UID: \"a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1\") " pod="openstack/nova-api-ecf5-account-create-update-k6kdv" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.619586 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.632950 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5c37-account-create-update-h9w2m"] Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.662796 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k25zs\" (UniqueName: \"kubernetes.io/projected/a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1-kube-api-access-k25zs\") pod \"nova-api-ecf5-account-create-update-k6kdv\" (UID: \"a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1\") " pod="openstack/nova-api-ecf5-account-create-update-k6kdv" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.721575 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-69599c8b5f-rjs76"] Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.722211 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-69599c8b5f-rjs76" podUID="ce072a84-75da-4060-9c4a-d029b3a14947" containerName="heat-engine" containerID="cri-o://824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062" gracePeriod=60 Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.728683 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dd874ae-fdb8-4f98-ae51-dac54a44e001-operator-scripts\") pod \"nova-cell0-5c37-account-create-update-h9w2m\" (UID: \"0dd874ae-fdb8-4f98-ae51-dac54a44e001\") " pod="openstack/nova-cell0-5c37-account-create-update-h9w2m" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.728777 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djb56\" (UniqueName: \"kubernetes.io/projected/0dd874ae-fdb8-4f98-ae51-dac54a44e001-kube-api-access-djb56\") pod \"nova-cell0-5c37-account-create-update-h9w2m\" (UID: \"0dd874ae-fdb8-4f98-ae51-dac54a44e001\") " pod="openstack/nova-cell0-5c37-account-create-update-h9w2m" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.733999 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bjlf4"] Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.739731 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjlf4" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.754158 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjlf4"] Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.755351 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7hfxs" Feb 02 13:25:21 crc kubenswrapper[4721]: E0202 13:25:21.756478 4721 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 02 13:25:21 crc kubenswrapper[4721]: E0202 13:25:21.787500 4721 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.805937 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ecf5-account-create-update-k6kdv" Feb 02 13:25:21 crc kubenswrapper[4721]: E0202 13:25:21.821671 4721 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 02 13:25:21 crc kubenswrapper[4721]: E0202 13:25:21.822597 4721 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-69599c8b5f-rjs76" podUID="ce072a84-75da-4060-9c4a-d029b3a14947" containerName="heat-engine" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.831637 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1487506-5263-4ffe-b3e0-1a7a507590f9-catalog-content\") pod \"redhat-marketplace-bjlf4\" (UID: \"f1487506-5263-4ffe-b3e0-1a7a507590f9\") " pod="openshift-marketplace/redhat-marketplace-bjlf4" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.831688 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvkl7\" (UniqueName: \"kubernetes.io/projected/f1487506-5263-4ffe-b3e0-1a7a507590f9-kube-api-access-rvkl7\") pod \"redhat-marketplace-bjlf4\" (UID: \"f1487506-5263-4ffe-b3e0-1a7a507590f9\") " pod="openshift-marketplace/redhat-marketplace-bjlf4" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.831722 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1487506-5263-4ffe-b3e0-1a7a507590f9-utilities\") pod \"redhat-marketplace-bjlf4\" (UID: \"f1487506-5263-4ffe-b3e0-1a7a507590f9\") " pod="openshift-marketplace/redhat-marketplace-bjlf4" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.831832 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dd874ae-fdb8-4f98-ae51-dac54a44e001-operator-scripts\") pod \"nova-cell0-5c37-account-create-update-h9w2m\" (UID: \"0dd874ae-fdb8-4f98-ae51-dac54a44e001\") " pod="openstack/nova-cell0-5c37-account-create-update-h9w2m" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.832485 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djb56\" (UniqueName: \"kubernetes.io/projected/0dd874ae-fdb8-4f98-ae51-dac54a44e001-kube-api-access-djb56\") pod \"nova-cell0-5c37-account-create-update-h9w2m\" (UID: \"0dd874ae-fdb8-4f98-ae51-dac54a44e001\") " pod="openstack/nova-cell0-5c37-account-create-update-h9w2m" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.833239 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dd874ae-fdb8-4f98-ae51-dac54a44e001-operator-scripts\") pod \"nova-cell0-5c37-account-create-update-h9w2m\" (UID: \"0dd874ae-fdb8-4f98-ae51-dac54a44e001\") " pod="openstack/nova-cell0-5c37-account-create-update-h9w2m" Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.839261 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6b23-account-create-update-5q82h"] Feb 02 
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.841373 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6b23-account-create-update-5q82h"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.849251 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.869723 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djb56\" (UniqueName: \"kubernetes.io/projected/0dd874ae-fdb8-4f98-ae51-dac54a44e001-kube-api-access-djb56\") pod \"nova-cell0-5c37-account-create-update-h9w2m\" (UID: \"0dd874ae-fdb8-4f98-ae51-dac54a44e001\") " pod="openstack/nova-cell0-5c37-account-create-update-h9w2m"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.905633 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6b23-account-create-update-5q82h"]
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.949433 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/011e7b6f-64eb-48b5-be89-8304581d4c5f-operator-scripts\") pod \"nova-cell1-6b23-account-create-update-5q82h\" (UID: \"011e7b6f-64eb-48b5-be89-8304581d4c5f\") " pod="openstack/nova-cell1-6b23-account-create-update-5q82h"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.950057 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1487506-5263-4ffe-b3e0-1a7a507590f9-catalog-content\") pod \"redhat-marketplace-bjlf4\" (UID: \"f1487506-5263-4ffe-b3e0-1a7a507590f9\") " pod="openshift-marketplace/redhat-marketplace-bjlf4"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.950162 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvkl7\" (UniqueName: \"kubernetes.io/projected/f1487506-5263-4ffe-b3e0-1a7a507590f9-kube-api-access-rvkl7\") pod \"redhat-marketplace-bjlf4\" (UID: \"f1487506-5263-4ffe-b3e0-1a7a507590f9\") " pod="openshift-marketplace/redhat-marketplace-bjlf4"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.966536 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1487506-5263-4ffe-b3e0-1a7a507590f9-utilities\") pod \"redhat-marketplace-bjlf4\" (UID: \"f1487506-5263-4ffe-b3e0-1a7a507590f9\") " pod="openshift-marketplace/redhat-marketplace-bjlf4"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.967142 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5c37-account-create-update-h9w2m"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.967745 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdp4w\" (UniqueName: \"kubernetes.io/projected/011e7b6f-64eb-48b5-be89-8304581d4c5f-kube-api-access-kdp4w\") pod \"nova-cell1-6b23-account-create-update-5q82h\" (UID: \"011e7b6f-64eb-48b5-be89-8304581d4c5f\") " pod="openstack/nova-cell1-6b23-account-create-update-5q82h"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.968248 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1487506-5263-4ffe-b3e0-1a7a507590f9-utilities\") pod \"redhat-marketplace-bjlf4\" (UID: \"f1487506-5263-4ffe-b3e0-1a7a507590f9\") " pod="openshift-marketplace/redhat-marketplace-bjlf4"
Feb 02 13:25:21 crc kubenswrapper[4721]: I0202 13:25:21.968507 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1487506-5263-4ffe-b3e0-1a7a507590f9-catalog-content\") pod \"redhat-marketplace-bjlf4\" (UID: \"f1487506-5263-4ffe-b3e0-1a7a507590f9\") " pod="openshift-marketplace/redhat-marketplace-bjlf4"
Feb 02 13:25:22 crc kubenswrapper[4721]: I0202 13:25:22.024874 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvkl7\" (UniqueName: \"kubernetes.io/projected/f1487506-5263-4ffe-b3e0-1a7a507590f9-kube-api-access-rvkl7\") pod \"redhat-marketplace-bjlf4\" (UID: \"f1487506-5263-4ffe-b3e0-1a7a507590f9\") " pod="openshift-marketplace/redhat-marketplace-bjlf4"
Feb 02 13:25:22 crc kubenswrapper[4721]: I0202 13:25:22.070600 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/011e7b6f-64eb-48b5-be89-8304581d4c5f-operator-scripts\") pod \"nova-cell1-6b23-account-create-update-5q82h\" (UID: \"011e7b6f-64eb-48b5-be89-8304581d4c5f\") " pod="openstack/nova-cell1-6b23-account-create-update-5q82h"
Feb 02 13:25:22 crc kubenswrapper[4721]: I0202 13:25:22.070841 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdp4w\" (UniqueName: \"kubernetes.io/projected/011e7b6f-64eb-48b5-be89-8304581d4c5f-kube-api-access-kdp4w\") pod \"nova-cell1-6b23-account-create-update-5q82h\" (UID: \"011e7b6f-64eb-48b5-be89-8304581d4c5f\") " pod="openstack/nova-cell1-6b23-account-create-update-5q82h"
Feb 02 13:25:22 crc kubenswrapper[4721]: I0202 13:25:22.073733 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/011e7b6f-64eb-48b5-be89-8304581d4c5f-operator-scripts\") pod \"nova-cell1-6b23-account-create-update-5q82h\" (UID: \"011e7b6f-64eb-48b5-be89-8304581d4c5f\") " pod="openstack/nova-cell1-6b23-account-create-update-5q82h"
Feb 02 13:25:22 crc kubenswrapper[4721]: I0202 13:25:22.115705 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdp4w\" (UniqueName: \"kubernetes.io/projected/011e7b6f-64eb-48b5-be89-8304581d4c5f-kube-api-access-kdp4w\") pod \"nova-cell1-6b23-account-create-update-5q82h\" (UID: \"011e7b6f-64eb-48b5-be89-8304581d4c5f\") " pod="openstack/nova-cell1-6b23-account-create-update-5q82h"
Feb 02 13:25:22 crc kubenswrapper[4721]: I0202 13:25:22.124164 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjlf4"
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjlf4" Feb 02 13:25:22 crc kubenswrapper[4721]: I0202 13:25:22.147499 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="0eabfa0b-0304-4eda-8f8a-dc9160569e4b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.228:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 13:25:22 crc kubenswrapper[4721]: I0202 13:25:22.181674 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6b23-account-create-update-5q82h" Feb 02 13:25:22 crc kubenswrapper[4721]: I0202 13:25:22.353466 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-86974d69bd-t6gcz" Feb 02 13:25:22 crc kubenswrapper[4721]: I0202 13:25:22.459151 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8075cf6d-3ae0-468e-98cb-5f341d78b8ac" path="/var/lib/kubelet/pods/8075cf6d-3ae0-468e-98cb-5f341d78b8ac/volumes" Feb 02 13:25:22 crc kubenswrapper[4721]: I0202 13:25:22.503595 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-869db4994-hgxnh"] Feb 02 13:25:22 crc kubenswrapper[4721]: I0202 13:25:22.592142 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-22bg7"] Feb 02 13:25:22 crc kubenswrapper[4721]: I0202 13:25:22.656367 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23e9328e-fd9a-4a87-946b-2c46e25bea51","Type":"ContainerStarted","Data":"b6effbfcc05b55e6fc18e1c48e951cdd4e7c47b90122d732e13d24419723120f"} Feb 02 13:25:22 crc kubenswrapper[4721]: I0202 13:25:22.685959 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-64c55c4cc7-4htzp" Feb 02 13:25:22 crc kubenswrapper[4721]: I0202 13:25:22.843833 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6c76b54b86-n9kln"] Feb 02 13:25:23 crc kubenswrapper[4721]: I0202 13:25:23.699814 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:23 crc kubenswrapper[4721]: I0202 13:25:23.774281 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-869db4994-hgxnh" event={"ID":"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8","Type":"ContainerDied","Data":"b49739d0b82310eb7a88b3289808bff2fac1bf21a0b025f6531793460d8748fd"} Feb 02 13:25:23 crc kubenswrapper[4721]: I0202 13:25:23.774624 4721 scope.go:117] "RemoveContainer" containerID="9892fb62f9251fa656b60a855b779d0ef03617e58fee62707134474322dbab8a" Feb 02 13:25:23 crc kubenswrapper[4721]: I0202 13:25:23.777012 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 13:25:23 crc kubenswrapper[4721]: I0202 13:25:23.818419 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-22bg7" event={"ID":"31d33cb7-8d98-44cc-97ef-229d34805e46","Type":"ContainerStarted","Data":"4fc7b5047fba1d65d14acd76d647401f0b37510534baf6ea3bf2254dfd744004"} Feb 02 13:25:23 crc kubenswrapper[4721]: I0202 13:25:23.818511 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-22bg7" event={"ID":"31d33cb7-8d98-44cc-97ef-229d34805e46","Type":"ContainerStarted","Data":"17351fe868bc448175365bc355dc57fe9dc3ca03a3c9d151703111ff468d604c"} Feb 02 13:25:23 crc kubenswrapper[4721]: I0202 13:25:23.847641 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zbntf"] Feb 02 13:25:23 crc kubenswrapper[4721]: I0202 13:25:23.885917 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-config-data-custom\") pod \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " Feb 02 13:25:23 crc kubenswrapper[4721]: I0202 13:25:23.886026 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95q9j\" (UniqueName: \"kubernetes.io/projected/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-kube-api-access-95q9j\") pod \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " Feb 02 13:25:23 crc kubenswrapper[4721]: I0202 13:25:23.886142 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-combined-ca-bundle\") pod \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " Feb 02 13:25:23 crc kubenswrapper[4721]: I0202 13:25:23.886273 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-config-data\") pod \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " Feb 02 13:25:23 crc kubenswrapper[4721]: I0202 13:25:23.925093 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-kube-api-access-95q9j" (OuterVolumeSpecName: "kube-api-access-95q9j") pod "2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" (UID: "2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8"). InnerVolumeSpecName "kube-api-access-95q9j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:23 crc kubenswrapper[4721]: I0202 13:25:23.936152 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-22bg7" podStartSLOduration=3.936101807 podStartE2EDuration="3.936101807s" podCreationTimestamp="2026-02-02 13:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:25:23.841437399 +0000 UTC m=+1464.143951788" watchObservedRunningTime="2026-02-02 13:25:23.936101807 +0000 UTC m=+1464.238616196" Feb 02 13:25:23 crc kubenswrapper[4721]: I0202 13:25:23.938377 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" (UID: "2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:23 crc kubenswrapper[4721]: I0202 13:25:23.959195 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" (UID: "2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.040453 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-config-data" (OuterVolumeSpecName: "config-data") pod "2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" (UID: "2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.057287 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-config-data\") pod \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\" (UID: \"2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8\") " Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.058431 4721 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.058460 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95q9j\" (UniqueName: \"kubernetes.io/projected/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-kube-api-access-95q9j\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.058475 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:24 crc kubenswrapper[4721]: W0202 13:25:24.058803 4721 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8/volumes/kubernetes.io~secret/config-data Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.058822 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-config-data" (OuterVolumeSpecName: "config-data") pod "2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" (UID: "2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:24 crc kubenswrapper[4721]: E0202 13:25:24.122319 4721 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 02 13:25:24 crc kubenswrapper[4721]: E0202 13:25:24.162910 4721 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 02 13:25:24 crc kubenswrapper[4721]: E0202 13:25:24.185525 4721 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 02 13:25:24 crc kubenswrapper[4721]: E0202 13:25:24.185606 4721 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-69599c8b5f-rjs76" podUID="ce072a84-75da-4060-9c4a-d029b3a14947" containerName="heat-engine" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.205679 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.637531 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7hfxs"] Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.671153 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.675817 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ecf5-account-create-update-k6kdv"] Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.725969 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-config-data-custom\") pod \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.726147 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vs22\" (UniqueName: \"kubernetes.io/projected/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-kube-api-access-7vs22\") pod \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.726190 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-combined-ca-bundle\") pod \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.726315 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-config-data\") pod \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\" (UID: \"c5f7cb67-4d7c-4bc8-bf45-c949450206f0\") " Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.738894 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-kube-api-access-7vs22" (OuterVolumeSpecName: "kube-api-access-7vs22") pod "c5f7cb67-4d7c-4bc8-bf45-c949450206f0" (UID: "c5f7cb67-4d7c-4bc8-bf45-c949450206f0"). InnerVolumeSpecName "kube-api-access-7vs22". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.756606 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c5f7cb67-4d7c-4bc8-bf45-c949450206f0" (UID: "c5f7cb67-4d7c-4bc8-bf45-c949450206f0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.796154 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5f7cb67-4d7c-4bc8-bf45-c949450206f0" (UID: "c5f7cb67-4d7c-4bc8-bf45-c949450206f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.846687 4721 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.846734 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vs22\" (UniqueName: \"kubernetes.io/projected/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-kube-api-access-7vs22\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.846747 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.854627 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-config-data" (OuterVolumeSpecName: "config-data") pod "c5f7cb67-4d7c-4bc8-bf45-c949450206f0" (UID: "c5f7cb67-4d7c-4bc8-bf45-c949450206f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.893781 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" event={"ID":"c5f7cb67-4d7c-4bc8-bf45-c949450206f0","Type":"ContainerDied","Data":"622fe9b6176f587f10b3c6f779b6fa54763b7ccedb8408baf73c74843450fabf"} Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.893842 4721 scope.go:117] "RemoveContainer" containerID="fbc9c66b85258c4a25c9415a67c1a7b4969a3f4e6fb178430c5d1cfa81c475d3" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.893947 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6c76b54b86-n9kln" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.950891 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f7cb67-4d7c-4bc8-bf45-c949450206f0-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.954985 4721 generic.go:334] "Generic (PLEG): container finished" podID="31d33cb7-8d98-44cc-97ef-229d34805e46" containerID="4fc7b5047fba1d65d14acd76d647401f0b37510534baf6ea3bf2254dfd744004" exitCode=0 Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.955057 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-22bg7" event={"ID":"31d33cb7-8d98-44cc-97ef-229d34805e46","Type":"ContainerDied","Data":"4fc7b5047fba1d65d14acd76d647401f0b37510534baf6ea3bf2254dfd744004"} Feb 02 13:25:24 crc kubenswrapper[4721]: I0202 13:25:24.973495 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5c37-account-create-update-h9w2m"] Feb 02 13:25:25 crc kubenswrapper[4721]: W0202 13:25:24.999796 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dd874ae_fdb8_4f98_ae51_dac54a44e001.slice/crio-878a87f71ad3d4e4d73726cacc66c5efd64f6c529a185c729f0711dd8a268409 WatchSource:0}: Error finding container 878a87f71ad3d4e4d73726cacc66c5efd64f6c529a185c729f0711dd8a268409: Status 404 returned error can't find the container with id 878a87f71ad3d4e4d73726cacc66c5efd64f6c529a185c729f0711dd8a268409 Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:24.999994 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7hfxs" event={"ID":"147719a3-96ca-4551-a395-648dd45b4ce6","Type":"ContainerStarted","Data":"a55ae8457b5a788504e1e84ee76eca10d4fc6aebec31718324d6620e0d97f659"} Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:25.003349 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ecf5-account-create-update-k6kdv" event={"ID":"a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1","Type":"ContainerStarted","Data":"e8901988f690d98fcea72012066d25ce8684039b6b674156f34336f208a3df47"} Feb 02 13:25:25 crc kubenswrapper[4721]: W0202 13:25:25.008607 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod011e7b6f_64eb_48b5_be89_8304581d4c5f.slice/crio-5d76c238acf537be4b1d7f54733a1d3d5ad8fa6ffa042633ba6b38b8df9885e8 WatchSource:0}: Error finding container 5d76c238acf537be4b1d7f54733a1d3d5ad8fa6ffa042633ba6b38b8df9885e8: Status 404 returned error can't find the container with id 5d76c238acf537be4b1d7f54733a1d3d5ad8fa6ffa042633ba6b38b8df9885e8 Feb 02 13:25:25 crc kubenswrapper[4721]: W0202 13:25:25.014683 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1487506_5263_4ffe_b3e0_1a7a507590f9.slice/crio-baa4f2b8c7f38c357e7fd8029e914fb27779853104aeda245f29bdec34cc4815 WatchSource:0}: Error finding container baa4f2b8c7f38c357e7fd8029e914fb27779853104aeda245f29bdec34cc4815: Status 404 returned error can't find the container with id baa4f2b8c7f38c357e7fd8029e914fb27779853104aeda245f29bdec34cc4815 Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:25.023547 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"23e9328e-fd9a-4a87-946b-2c46e25bea51","Type":"ContainerStarted","Data":"8bd63c89c14f1ccb69ddfb8d130bbe5d4974317b78ad4239a84a5e73a223cecb"} Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:25.030157 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjlf4"] Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:25.041097 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6c76b54b86-n9kln"] Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:25.062597 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6c76b54b86-n9kln"] Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:25.072085 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-869db4994-hgxnh" Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:25.076663 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f5129b1-fc26-40ba-9cf7-0f86e93507cd","Type":"ContainerStarted","Data":"9c7d6cab875b3efe0c4b2c75baf8fffd41267d61cd17b3464709f39d27ada580"} Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:25.077710 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zbntf" event={"ID":"84e297f9-7808-4195-86b2-2c17f4638bf2","Type":"ContainerStarted","Data":"9f52f888eba9daf5e6e283524ff7481c4a05eeb6d1ae52e82e3fc8b08b7473c5"} Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:25.077730 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zbntf" event={"ID":"84e297f9-7808-4195-86b2-2c17f4638bf2","Type":"ContainerStarted","Data":"5f9fe02523a0af94db9eb026cf39983eed77325df1c327f0bf401d20a6a4ed62"} Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:25.083394 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6b23-account-create-update-5q82h"] Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:25.123378 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.123354387 podStartE2EDuration="7.123354387s" podCreationTimestamp="2026-02-02 13:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:25:25.050652535 +0000 UTC m=+1465.353166924" watchObservedRunningTime="2026-02-02 13:25:25.123354387 +0000 UTC m=+1465.425868786" Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:25.147266 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mlbxn" podUID="37372b76-ef54-4a44-9b56-dea754373219" containerName="registry-server" probeResult="failure" output=< Feb 02 13:25:25 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:25:25 crc kubenswrapper[4721]: > Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:25.195496 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-zbntf" podStartSLOduration=5.195473822 podStartE2EDuration="5.195473822s" podCreationTimestamp="2026-02-02 13:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:25:25.114644191 +0000 UTC m=+1465.417158580" watchObservedRunningTime="2026-02-02 13:25:25.195473822 +0000 UTC m=+1465.497988211" Feb 02 13:25:25 crc kubenswrapper[4721]: 
Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:25.245155 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-869db4994-hgxnh"]
Feb 02 13:25:25 crc kubenswrapper[4721]: I0202 13:25:25.250252 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-869db4994-hgxnh"]
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.161435 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ecf5-account-create-update-k6kdv" event={"ID":"a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1","Type":"ContainerStarted","Data":"d2ceca0ae565540870f02401237596a428eda4dcf72c0158ff8c7116acbfc486"}
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.166100 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="0eabfa0b-0304-4eda-8f8a-dc9160569e4b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.228:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
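The cinder-api liveness failure above is the kubelet's HTTP prober hitting its per-probe deadline: the quoted output is Go's net/http client-timeout error. A rough reproduction of the same request, assuming a 1s timeoutSeconds (the probe's actual timeout is not shown in the log); note that kubelet HTTPS probes do not verify the serving certificate, hence the skip-verify transport:

```go
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{
		Timeout: 1 * time.Second, // assumed probe timeoutSeconds, not taken from the log
		Transport: &http.Transport{
			// kubelet HTTPS probes skip certificate verification
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get("https://10.217.0.228:8776/healthcheck")
	if err != nil {
		// On a slow endpoint this prints the same
		// "(Client.Timeout exceeded while awaiting headers)" error text.
		fmt.Println(err)
		return
	}
	defer resp.Body.Close()
	fmt.Println(resp.Status) // a 2xx/3xx status would count as probe success
}
```

The readiness entries a second later show the endpoint recovering, so this was a transient slow response rather than a dead container.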
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.191378 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5c37-account-create-update-h9w2m" event={"ID":"0dd874ae-fdb8-4f98-ae51-dac54a44e001","Type":"ContainerStarted","Data":"878a87f71ad3d4e4d73726cacc66c5efd64f6c529a185c729f0711dd8a268409"}
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.220362 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjlf4" event={"ID":"f1487506-5263-4ffe-b3e0-1a7a507590f9","Type":"ContainerStarted","Data":"4e9ec40449c9a13570d177a0fa260a694310bc1d1b70021291444805d29dc83d"}
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.220415 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjlf4" event={"ID":"f1487506-5263-4ffe-b3e0-1a7a507590f9","Type":"ContainerStarted","Data":"baa4f2b8c7f38c357e7fd8029e914fb27779853104aeda245f29bdec34cc4815"}
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.247772 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-ecf5-account-create-update-k6kdv" podStartSLOduration=5.247746731 podStartE2EDuration="5.247746731s" podCreationTimestamp="2026-02-02 13:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:25:26.201653301 +0000 UTC m=+1466.504167700" watchObservedRunningTime="2026-02-02 13:25:26.247746731 +0000 UTC m=+1466.550261140"
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.248316 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6b23-account-create-update-5q82h" event={"ID":"011e7b6f-64eb-48b5-be89-8304581d4c5f","Type":"ContainerStarted","Data":"a4df6290a6ff822c9798aad4bb78bddad86f6ee3871a5520d115e6e491f3950e"}
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.248374 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6b23-account-create-update-5q82h" event={"ID":"011e7b6f-64eb-48b5-be89-8304581d4c5f","Type":"ContainerStarted","Data":"5d76c238acf537be4b1d7f54733a1d3d5ad8fa6ffa042633ba6b38b8df9885e8"}
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.285299 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f5129b1-fc26-40ba-9cf7-0f86e93507cd","Type":"ContainerStarted","Data":"1e83aec53278f669956b6c12748294af646707461d1e4d3bcbf8fa106771c9c4"}
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.310407 4721 generic.go:334] "Generic (PLEG): container finished" podID="84e297f9-7808-4195-86b2-2c17f4638bf2" containerID="9f52f888eba9daf5e6e283524ff7481c4a05eeb6d1ae52e82e3fc8b08b7473c5" exitCode=0
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.310766 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zbntf" event={"ID":"84e297f9-7808-4195-86b2-2c17f4638bf2","Type":"ContainerDied","Data":"9f52f888eba9daf5e6e283524ff7481c4a05eeb6d1ae52e82e3fc8b08b7473c5"}
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.328286 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-6b23-account-create-update-5q82h" podStartSLOduration=5.328262514 podStartE2EDuration="5.328262514s" podCreationTimestamp="2026-02-02 13:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:25:26.289184874 +0000 UTC m=+1466.591699263" watchObservedRunningTime="2026-02-02 13:25:26.328262514 +0000 UTC m=+1466.630776903"
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.360815 4721 generic.go:334] "Generic (PLEG): container finished" podID="147719a3-96ca-4551-a395-648dd45b4ce6" containerID="663f048cc96da678155c1a95cb9691e5043c9f03a4259611e0be7f358517c2f1" exitCode=0
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.361216 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7hfxs" event={"ID":"147719a3-96ca-4551-a395-648dd45b4ce6","Type":"ContainerDied","Data":"663f048cc96da678155c1a95cb9691e5043c9f03a4259611e0be7f358517c2f1"}
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.455341 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" path="/var/lib/kubelet/pods/2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8/volumes"
Feb 02 13:25:26 crc kubenswrapper[4721]: I0202 13:25:26.455972 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f7cb67-4d7c-4bc8-bf45-c949450206f0" path="/var/lib/kubelet/pods/c5f7cb67-4d7c-4bc8-bf45-c949450206f0/volumes"
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.098724 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-22bg7"
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.152405 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="0eabfa0b-0304-4eda-8f8a-dc9160569e4b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.228:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.169214 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.238900 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz92z\" (UniqueName: \"kubernetes.io/projected/31d33cb7-8d98-44cc-97ef-229d34805e46-kube-api-access-cz92z\") pod \"31d33cb7-8d98-44cc-97ef-229d34805e46\" (UID: \"31d33cb7-8d98-44cc-97ef-229d34805e46\") "
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.239373 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31d33cb7-8d98-44cc-97ef-229d34805e46-operator-scripts\") pod \"31d33cb7-8d98-44cc-97ef-229d34805e46\" (UID: \"31d33cb7-8d98-44cc-97ef-229d34805e46\") "
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.244237 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d33cb7-8d98-44cc-97ef-229d34805e46-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31d33cb7-8d98-44cc-97ef-229d34805e46" (UID: "31d33cb7-8d98-44cc-97ef-229d34805e46"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.256216 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d33cb7-8d98-44cc-97ef-229d34805e46-kube-api-access-cz92z" (OuterVolumeSpecName: "kube-api-access-cz92z") pod "31d33cb7-8d98-44cc-97ef-229d34805e46" (UID: "31d33cb7-8d98-44cc-97ef-229d34805e46"). InnerVolumeSpecName "kube-api-access-cz92z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.346121 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31d33cb7-8d98-44cc-97ef-229d34805e46-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.346591 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz92z\" (UniqueName: \"kubernetes.io/projected/31d33cb7-8d98-44cc-97ef-229d34805e46-kube-api-access-cz92z\") on node \"crc\" DevicePath \"\""
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.407994 4721 generic.go:334] "Generic (PLEG): container finished" podID="0dd874ae-fdb8-4f98-ae51-dac54a44e001" containerID="a5de70229867c13b14cef49c003c51ee5606bf312afc995295088db8627fee73" exitCode=0
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.408153 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5c37-account-create-update-h9w2m" event={"ID":"0dd874ae-fdb8-4f98-ae51-dac54a44e001","Type":"ContainerDied","Data":"a5de70229867c13b14cef49c003c51ee5606bf312afc995295088db8627fee73"}
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.438338 4721 generic.go:334] "Generic (PLEG): container finished" podID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerID="22439dbfbc2d5b9a7aaa38b029bb7c102abafe93d1e9397f189c54e9d9a93d8a" exitCode=0
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.438496 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"679713b8-7e9b-4ccc-87f3-85afd17dc008","Type":"ContainerDied","Data":"22439dbfbc2d5b9a7aaa38b029bb7c102abafe93d1e9397f189c54e9d9a93d8a"}
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.456421 4721 generic.go:334] "Generic (PLEG): container finished" podID="f1487506-5263-4ffe-b3e0-1a7a507590f9" containerID="4e9ec40449c9a13570d177a0fa260a694310bc1d1b70021291444805d29dc83d" exitCode=0
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.456507 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjlf4" event={"ID":"f1487506-5263-4ffe-b3e0-1a7a507590f9","Type":"ContainerDied","Data":"4e9ec40449c9a13570d177a0fa260a694310bc1d1b70021291444805d29dc83d"}
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.464615 4721 generic.go:334] "Generic (PLEG): container finished" podID="011e7b6f-64eb-48b5-be89-8304581d4c5f" containerID="a4df6290a6ff822c9798aad4bb78bddad86f6ee3871a5520d115e6e491f3950e" exitCode=0
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.464695 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6b23-account-create-update-5q82h" event={"ID":"011e7b6f-64eb-48b5-be89-8304581d4c5f","Type":"ContainerDied","Data":"a4df6290a6ff822c9798aad4bb78bddad86f6ee3871a5520d115e6e491f3950e"}
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.467810 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f5129b1-fc26-40ba-9cf7-0f86e93507cd","Type":"ContainerStarted","Data":"df04a82f1d41907c901afd253e9b94f6426419fca2571acca9d0b478ca4084dd"}
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.502408 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-22bg7" event={"ID":"31d33cb7-8d98-44cc-97ef-229d34805e46","Type":"ContainerDied","Data":"17351fe868bc448175365bc355dc57fe9dc3ca03a3c9d151703111ff468d604c"}
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.502449 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17351fe868bc448175365bc355dc57fe9dc3ca03a3c9d151703111ff468d604c"
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.502533 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-22bg7"
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.508244 4721 generic.go:334] "Generic (PLEG): container finished" podID="a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1" containerID="d2ceca0ae565540870f02401237596a428eda4dcf72c0158ff8c7116acbfc486" exitCode=0
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.508514 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ecf5-account-create-update-k6kdv" event={"ID":"a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1","Type":"ContainerDied","Data":"d2ceca0ae565540870f02401237596a428eda4dcf72c0158ff8c7116acbfc486"}
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.525703 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.52567418 podStartE2EDuration="7.52567418s" podCreationTimestamp="2026-02-02 13:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:25:27.520584262 +0000 UTC m=+1467.823098651" watchObservedRunningTime="2026-02-02 13:25:27.52567418 +0000 UTC m=+1467.828188569"
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.813335 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.990297 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-config-data\") pod \"679713b8-7e9b-4ccc-87f3-85afd17dc008\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") "
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.990401 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/679713b8-7e9b-4ccc-87f3-85afd17dc008-run-httpd\") pod \"679713b8-7e9b-4ccc-87f3-85afd17dc008\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") "
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.990421 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-scripts\") pod \"679713b8-7e9b-4ccc-87f3-85afd17dc008\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") "
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.990524 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-combined-ca-bundle\") pod \"679713b8-7e9b-4ccc-87f3-85afd17dc008\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") "
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.990582 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7n46\" (UniqueName: \"kubernetes.io/projected/679713b8-7e9b-4ccc-87f3-85afd17dc008-kube-api-access-r7n46\") pod \"679713b8-7e9b-4ccc-87f3-85afd17dc008\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") "
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.990796 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/679713b8-7e9b-4ccc-87f3-85afd17dc008-log-httpd\") pod \"679713b8-7e9b-4ccc-87f3-85afd17dc008\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") "
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.990877 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-sg-core-conf-yaml\") pod \"679713b8-7e9b-4ccc-87f3-85afd17dc008\" (UID: \"679713b8-7e9b-4ccc-87f3-85afd17dc008\") "
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.991192 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/679713b8-7e9b-4ccc-87f3-85afd17dc008-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "679713b8-7e9b-4ccc-87f3-85afd17dc008" (UID: "679713b8-7e9b-4ccc-87f3-85afd17dc008"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.991593 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/679713b8-7e9b-4ccc-87f3-85afd17dc008-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "679713b8-7e9b-4ccc-87f3-85afd17dc008" (UID: "679713b8-7e9b-4ccc-87f3-85afd17dc008"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 13:25:27 crc kubenswrapper[4721]: I0202 13:25:27.991715 4721 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/679713b8-7e9b-4ccc-87f3-85afd17dc008-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.032559 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-scripts" (OuterVolumeSpecName: "scripts") pod "679713b8-7e9b-4ccc-87f3-85afd17dc008" (UID: "679713b8-7e9b-4ccc-87f3-85afd17dc008"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.032732 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/679713b8-7e9b-4ccc-87f3-85afd17dc008-kube-api-access-r7n46" (OuterVolumeSpecName: "kube-api-access-r7n46") pod "679713b8-7e9b-4ccc-87f3-85afd17dc008" (UID: "679713b8-7e9b-4ccc-87f3-85afd17dc008"). InnerVolumeSpecName "kube-api-access-r7n46". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.095698 4721 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/679713b8-7e9b-4ccc-87f3-85afd17dc008-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.095735 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.095746 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7n46\" (UniqueName: \"kubernetes.io/projected/679713b8-7e9b-4ccc-87f3-85afd17dc008-kube-api-access-r7n46\") on node \"crc\" DevicePath \"\""
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.118319 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "679713b8-7e9b-4ccc-87f3-85afd17dc008" (UID: "679713b8-7e9b-4ccc-87f3-85afd17dc008"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.239480 4721 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.264819 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zbntf"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.357935 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-config-data" (OuterVolumeSpecName: "config-data") pod "679713b8-7e9b-4ccc-87f3-85afd17dc008" (UID: "679713b8-7e9b-4ccc-87f3-85afd17dc008"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.376303 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "679713b8-7e9b-4ccc-87f3-85afd17dc008" (UID: "679713b8-7e9b-4ccc-87f3-85afd17dc008"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.455608 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmnpw\" (UniqueName: \"kubernetes.io/projected/84e297f9-7808-4195-86b2-2c17f4638bf2-kube-api-access-bmnpw\") pod \"84e297f9-7808-4195-86b2-2c17f4638bf2\" (UID: \"84e297f9-7808-4195-86b2-2c17f4638bf2\") " Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.455769 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e297f9-7808-4195-86b2-2c17f4638bf2-operator-scripts\") pod \"84e297f9-7808-4195-86b2-2c17f4638bf2\" (UID: \"84e297f9-7808-4195-86b2-2c17f4638bf2\") " Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.457608 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.457650 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679713b8-7e9b-4ccc-87f3-85afd17dc008-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.458916 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84e297f9-7808-4195-86b2-2c17f4638bf2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84e297f9-7808-4195-86b2-2c17f4638bf2" (UID: "84e297f9-7808-4195-86b2-2c17f4638bf2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.461543 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e297f9-7808-4195-86b2-2c17f4638bf2-kube-api-access-bmnpw" (OuterVolumeSpecName: "kube-api-access-bmnpw") pod "84e297f9-7808-4195-86b2-2c17f4638bf2" (UID: "84e297f9-7808-4195-86b2-2c17f4638bf2"). InnerVolumeSpecName "kube-api-access-bmnpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.468560 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7hfxs" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.533293 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7hfxs" event={"ID":"147719a3-96ca-4551-a395-648dd45b4ce6","Type":"ContainerDied","Data":"a55ae8457b5a788504e1e84ee76eca10d4fc6aebec31718324d6620e0d97f659"} Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.533355 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a55ae8457b5a788504e1e84ee76eca10d4fc6aebec31718324d6620e0d97f659" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.533422 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7hfxs" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.540764 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"679713b8-7e9b-4ccc-87f3-85afd17dc008","Type":"ContainerDied","Data":"6f08f3fdbee7bda0664d6d60e3f3f3c62d78c95caee038e2d39526bac0095990"} Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.540816 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.540989 4721 scope.go:117] "RemoveContainer" containerID="ab6eff23a607854558fcb2aa7cbae46f97306ff156928f4f08c74867721f4d2c" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.546612 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zbntf" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.546670 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zbntf" event={"ID":"84e297f9-7808-4195-86b2-2c17f4638bf2","Type":"ContainerDied","Data":"5f9fe02523a0af94db9eb026cf39983eed77325df1c327f0bf401d20a6a4ed62"} Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.546708 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f9fe02523a0af94db9eb026cf39983eed77325df1c327f0bf401d20a6a4ed62" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.559543 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/147719a3-96ca-4551-a395-648dd45b4ce6-operator-scripts\") pod \"147719a3-96ca-4551-a395-648dd45b4ce6\" (UID: \"147719a3-96ca-4551-a395-648dd45b4ce6\") " Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.559626 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flbcl\" (UniqueName: \"kubernetes.io/projected/147719a3-96ca-4551-a395-648dd45b4ce6-kube-api-access-flbcl\") pod \"147719a3-96ca-4551-a395-648dd45b4ce6\" (UID: \"147719a3-96ca-4551-a395-648dd45b4ce6\") " Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.561742 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/147719a3-96ca-4551-a395-648dd45b4ce6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "147719a3-96ca-4551-a395-648dd45b4ce6" (UID: "147719a3-96ca-4551-a395-648dd45b4ce6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.581312 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/147719a3-96ca-4551-a395-648dd45b4ce6-kube-api-access-flbcl" (OuterVolumeSpecName: "kube-api-access-flbcl") pod "147719a3-96ca-4551-a395-648dd45b4ce6" (UID: "147719a3-96ca-4551-a395-648dd45b4ce6"). InnerVolumeSpecName "kube-api-access-flbcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.582249 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/147719a3-96ca-4551-a395-648dd45b4ce6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.582275 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flbcl\" (UniqueName: \"kubernetes.io/projected/147719a3-96ca-4551-a395-648dd45b4ce6-kube-api-access-flbcl\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.582290 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmnpw\" (UniqueName: \"kubernetes.io/projected/84e297f9-7808-4195-86b2-2c17f4638bf2-kube-api-access-bmnpw\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.582300 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84e297f9-7808-4195-86b2-2c17f4638bf2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.642329 4721 scope.go:117] "RemoveContainer" containerID="3ad385a2be5227562f34db7d77851850b136530db655dfc87e87234604886068" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.643415 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.680188 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.727018 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:25:28 crc kubenswrapper[4721]: E0202 13:25:28.727597 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d33cb7-8d98-44cc-97ef-229d34805e46" containerName="mariadb-database-create" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.727615 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d33cb7-8d98-44cc-97ef-229d34805e46" containerName="mariadb-database-create" Feb 02 13:25:28 crc kubenswrapper[4721]: E0202 13:25:28.727640 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerName="ceilometer-notification-agent" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.727648 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerName="ceilometer-notification-agent" Feb 02 13:25:28 crc kubenswrapper[4721]: E0202 13:25:28.727669 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f7cb67-4d7c-4bc8-bf45-c949450206f0" containerName="heat-cfnapi" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.727677 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f7cb67-4d7c-4bc8-bf45-c949450206f0" containerName="heat-cfnapi" Feb 02 13:25:28 crc kubenswrapper[4721]: E0202 13:25:28.727698 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" containerName="heat-api" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.727705 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" containerName="heat-api" Feb 02 13:25:28 crc kubenswrapper[4721]: E0202 13:25:28.727722 4721 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" containerName="heat-api" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.727729 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" containerName="heat-api" Feb 02 13:25:28 crc kubenswrapper[4721]: E0202 13:25:28.727745 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerName="proxy-httpd" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.727751 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerName="proxy-httpd" Feb 02 13:25:28 crc kubenswrapper[4721]: E0202 13:25:28.727764 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerName="ceilometer-central-agent" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.727769 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerName="ceilometer-central-agent" Feb 02 13:25:28 crc kubenswrapper[4721]: E0202 13:25:28.727784 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e297f9-7808-4195-86b2-2c17f4638bf2" containerName="mariadb-database-create" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.727790 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e297f9-7808-4195-86b2-2c17f4638bf2" containerName="mariadb-database-create" Feb 02 13:25:28 crc kubenswrapper[4721]: E0202 13:25:28.727798 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="147719a3-96ca-4551-a395-648dd45b4ce6" containerName="mariadb-database-create" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.727804 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="147719a3-96ca-4551-a395-648dd45b4ce6" containerName="mariadb-database-create" Feb 02 13:25:28 crc kubenswrapper[4721]: E0202 13:25:28.727815 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerName="sg-core" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.727821 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerName="sg-core" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.728102 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" containerName="heat-api" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.728117 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerName="proxy-httpd" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.728129 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc9b06c-1d5d-4e57-b1ab-13bdd82098b8" containerName="heat-api" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.728139 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="679713b8-7e9b-4ccc-87f3-85afd17dc008" containerName="ceilometer-notification-agent" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.728154 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="31d33cb7-8d98-44cc-97ef-229d34805e46" containerName="mariadb-database-create" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.728163 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f7cb67-4d7c-4bc8-bf45-c949450206f0" containerName="heat-cfnapi" Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.728177 4721 
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.743666 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.744420 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.749491 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.749606 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.757297 4721 scope.go:117] "RemoveContainer" containerID="f207fef2f76230ae29ed92563bfe6192bf4e5751fb81be7d8bc1400d4d42e087"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.889500 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-config-data\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.890034 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.890447 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-scripts\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.890504 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.890527 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-run-httpd\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.890586 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-log-httpd\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.890608 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85frw\" (UniqueName: \"kubernetes.io/projected/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-kube-api-access-85frw\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.996780 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.996979 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-scripts\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.997000 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.997023 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-run-httpd\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.997087 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-log-httpd\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.997107 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85frw\" (UniqueName: \"kubernetes.io/projected/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-kube-api-access-85frw\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.997217 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-config-data\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.997715 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-log-httpd\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:28 crc kubenswrapper[4721]: I0202 13:25:28.997740 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-run-httpd\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.004282 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-config-data\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.006801 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-scripts\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.014519 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.031920 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85frw\" (UniqueName: \"kubernetes.io/projected/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-kube-api-access-85frw\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.035829 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " pod="openstack/ceilometer-0"
Need to start a new one" pod="openstack/nova-cell0-5c37-account-create-update-h9w2m" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.172395 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.172446 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.226137 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djb56\" (UniqueName: \"kubernetes.io/projected/0dd874ae-fdb8-4f98-ae51-dac54a44e001-kube-api-access-djb56\") pod \"0dd874ae-fdb8-4f98-ae51-dac54a44e001\" (UID: \"0dd874ae-fdb8-4f98-ae51-dac54a44e001\") " Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.226399 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dd874ae-fdb8-4f98-ae51-dac54a44e001-operator-scripts\") pod \"0dd874ae-fdb8-4f98-ae51-dac54a44e001\" (UID: \"0dd874ae-fdb8-4f98-ae51-dac54a44e001\") " Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.228037 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dd874ae-fdb8-4f98-ae51-dac54a44e001-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0dd874ae-fdb8-4f98-ae51-dac54a44e001" (UID: "0dd874ae-fdb8-4f98-ae51-dac54a44e001"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.239697 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd874ae-fdb8-4f98-ae51-dac54a44e001-kube-api-access-djb56" (OuterVolumeSpecName: "kube-api-access-djb56") pod "0dd874ae-fdb8-4f98-ae51-dac54a44e001" (UID: "0dd874ae-fdb8-4f98-ae51-dac54a44e001"). InnerVolumeSpecName "kube-api-access-djb56". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.325158 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.329892 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dd874ae-fdb8-4f98-ae51-dac54a44e001-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.329927 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djb56\" (UniqueName: \"kubernetes.io/projected/0dd874ae-fdb8-4f98-ae51-dac54a44e001-kube-api-access-djb56\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.338285 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.526942 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6b23-account-create-update-5q82h" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.539618 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ecf5-account-create-update-k6kdv" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.583165 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ecf5-account-create-update-k6kdv" event={"ID":"a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1","Type":"ContainerDied","Data":"e8901988f690d98fcea72012066d25ce8684039b6b674156f34336f208a3df47"} Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.583546 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8901988f690d98fcea72012066d25ce8684039b6b674156f34336f208a3df47" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.583630 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ecf5-account-create-update-k6kdv" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.621939 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5c37-account-create-update-h9w2m" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.630655 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5c37-account-create-update-h9w2m" event={"ID":"0dd874ae-fdb8-4f98-ae51-dac54a44e001","Type":"ContainerDied","Data":"878a87f71ad3d4e4d73726cacc66c5efd64f6c529a185c729f0711dd8a268409"} Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.630716 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="878a87f71ad3d4e4d73726cacc66c5efd64f6c529a185c729f0711dd8a268409" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.639637 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1-operator-scripts\") pod \"a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1\" (UID: \"a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1\") " Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.639859 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdp4w\" (UniqueName: \"kubernetes.io/projected/011e7b6f-64eb-48b5-be89-8304581d4c5f-kube-api-access-kdp4w\") pod \"011e7b6f-64eb-48b5-be89-8304581d4c5f\" (UID: \"011e7b6f-64eb-48b5-be89-8304581d4c5f\") " Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.639956 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k25zs\" (UniqueName: \"kubernetes.io/projected/a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1-kube-api-access-k25zs\") pod \"a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1\" (UID: \"a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1\") " Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.640003 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/011e7b6f-64eb-48b5-be89-8304581d4c5f-operator-scripts\") pod \"011e7b6f-64eb-48b5-be89-8304581d4c5f\" (UID: \"011e7b6f-64eb-48b5-be89-8304581d4c5f\") " Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.640903 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6b23-account-create-update-5q82h" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.641160 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6b23-account-create-update-5q82h" event={"ID":"011e7b6f-64eb-48b5-be89-8304581d4c5f","Type":"ContainerDied","Data":"5d76c238acf537be4b1d7f54733a1d3d5ad8fa6ffa042633ba6b38b8df9885e8"} Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.641194 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d76c238acf537be4b1d7f54733a1d3d5ad8fa6ffa042633ba6b38b8df9885e8" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.641215 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.641359 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.641475 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/011e7b6f-64eb-48b5-be89-8304581d4c5f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "011e7b6f-64eb-48b5-be89-8304581d4c5f" (UID: "011e7b6f-64eb-48b5-be89-8304581d4c5f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.643576 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1" (UID: "a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.647315 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/011e7b6f-64eb-48b5-be89-8304581d4c5f-kube-api-access-kdp4w" (OuterVolumeSpecName: "kube-api-access-kdp4w") pod "011e7b6f-64eb-48b5-be89-8304581d4c5f" (UID: "011e7b6f-64eb-48b5-be89-8304581d4c5f"). InnerVolumeSpecName "kube-api-access-kdp4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.653747 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1-kube-api-access-k25zs" (OuterVolumeSpecName: "kube-api-access-k25zs") pod "a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1" (UID: "a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1"). InnerVolumeSpecName "kube-api-access-k25zs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.749050 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.749411 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdp4w\" (UniqueName: \"kubernetes.io/projected/011e7b6f-64eb-48b5-be89-8304581d4c5f-kube-api-access-kdp4w\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.749428 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k25zs\" (UniqueName: \"kubernetes.io/projected/a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1-kube-api-access-k25zs\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:29 crc kubenswrapper[4721]: I0202 13:25:29.749439 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/011e7b6f-64eb-48b5-be89-8304581d4c5f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:30 crc kubenswrapper[4721]: I0202 13:25:30.257947 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:25:30 crc kubenswrapper[4721]: I0202 13:25:30.423626 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="679713b8-7e9b-4ccc-87f3-85afd17dc008" path="/var/lib/kubelet/pods/679713b8-7e9b-4ccc-87f3-85afd17dc008/volumes" Feb 02 13:25:30 crc kubenswrapper[4721]: I0202 13:25:30.661318 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjlf4" event={"ID":"f1487506-5263-4ffe-b3e0-1a7a507590f9","Type":"ContainerStarted","Data":"286f545d594e5abd14663ecadd58ca5f7b2093d194dddbbe4cdbc1b3011f703d"} Feb 02 13:25:30 crc kubenswrapper[4721]: I0202 13:25:30.666668 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3","Type":"ContainerStarted","Data":"17422626d1aef368e4a89ff57fcd151611e206eb400b70c99ece56e2a2b12dbd"} Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.304915 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.304962 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.365602 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.375260 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.678825 4721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.679197 4721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.679383 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.679422 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.882933 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9h27j"] Feb 02 13:25:31 crc kubenswrapper[4721]: E0202 13:25:31.883571 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1" containerName="mariadb-account-create-update" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.883596 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1" containerName="mariadb-account-create-update" Feb 02 13:25:31 crc kubenswrapper[4721]: E0202 13:25:31.883633 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd874ae-fdb8-4f98-ae51-dac54a44e001" containerName="mariadb-account-create-update" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.883642 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd874ae-fdb8-4f98-ae51-dac54a44e001" containerName="mariadb-account-create-update" Feb 02 13:25:31 crc kubenswrapper[4721]: E0202 13:25:31.883663 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011e7b6f-64eb-48b5-be89-8304581d4c5f" containerName="mariadb-account-create-update" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.883674 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="011e7b6f-64eb-48b5-be89-8304581d4c5f" containerName="mariadb-account-create-update" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.883987 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd874ae-fdb8-4f98-ae51-dac54a44e001" containerName="mariadb-account-create-update" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.884016 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="011e7b6f-64eb-48b5-be89-8304581d4c5f" containerName="mariadb-account-create-update" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.884030 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1" containerName="mariadb-account-create-update" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.884969 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.886833 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mq69d" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.887326 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.887628 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 02 13:25:31 crc kubenswrapper[4721]: I0202 13:25:31.902486 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9h27j"] Feb 02 13:25:32 crc kubenswrapper[4721]: I0202 13:25:32.024378 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9h27j\" (UID: \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:25:32 crc kubenswrapper[4721]: I0202 13:25:32.024721 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc8q5\" (UniqueName: \"kubernetes.io/projected/f01b253a-c7c6-4c9e-a800-a1732ba06f37-kube-api-access-zc8q5\") pod \"nova-cell0-conductor-db-sync-9h27j\" (UID: \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:25:32 crc kubenswrapper[4721]: I0202 13:25:32.024848 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-config-data\") pod \"nova-cell0-conductor-db-sync-9h27j\" (UID: \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:25:32 crc kubenswrapper[4721]: I0202 13:25:32.025040 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-scripts\") pod \"nova-cell0-conductor-db-sync-9h27j\" (UID: \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:25:32 crc kubenswrapper[4721]: I0202 13:25:32.127310 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc8q5\" (UniqueName: \"kubernetes.io/projected/f01b253a-c7c6-4c9e-a800-a1732ba06f37-kube-api-access-zc8q5\") pod \"nova-cell0-conductor-db-sync-9h27j\" (UID: \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:25:32 crc kubenswrapper[4721]: I0202 13:25:32.127646 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-config-data\") pod \"nova-cell0-conductor-db-sync-9h27j\" (UID: \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:25:32 crc kubenswrapper[4721]: I0202 13:25:32.127706 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-scripts\") pod \"nova-cell0-conductor-db-sync-9h27j\" (UID: 
\"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:25:32 crc kubenswrapper[4721]: I0202 13:25:32.127770 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9h27j\" (UID: \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:25:32 crc kubenswrapper[4721]: I0202 13:25:32.135614 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-scripts\") pod \"nova-cell0-conductor-db-sync-9h27j\" (UID: \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:25:32 crc kubenswrapper[4721]: I0202 13:25:32.135911 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9h27j\" (UID: \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:25:32 crc kubenswrapper[4721]: I0202 13:25:32.136706 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-config-data\") pod \"nova-cell0-conductor-db-sync-9h27j\" (UID: \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:25:32 crc kubenswrapper[4721]: I0202 13:25:32.148499 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc8q5\" (UniqueName: \"kubernetes.io/projected/f01b253a-c7c6-4c9e-a800-a1732ba06f37-kube-api-access-zc8q5\") pod \"nova-cell0-conductor-db-sync-9h27j\" (UID: \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:25:32 crc kubenswrapper[4721]: I0202 13:25:32.208081 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:25:32 crc kubenswrapper[4721]: I0202 13:25:32.711696 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3","Type":"ContainerStarted","Data":"250d61da9c9aec53fa8403200cdb7bf07e9fd015ad372194d53e43f3edd63adb"} Feb 02 13:25:32 crc kubenswrapper[4721]: W0202 13:25:32.749625 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01b253a_c7c6_4c9e_a800_a1732ba06f37.slice/crio-5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6 WatchSource:0}: Error finding container 5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6: Status 404 returned error can't find the container with id 5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6 Feb 02 13:25:32 crc kubenswrapper[4721]: I0202 13:25:32.762030 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9h27j"] Feb 02 13:25:33 crc kubenswrapper[4721]: I0202 13:25:33.747496 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3","Type":"ContainerStarted","Data":"b0b5571eb117a42c85d555cc9f73ebf04bc2152bc212ed2d51da96cbfa4c0649"} Feb 02 13:25:33 crc kubenswrapper[4721]: I0202 13:25:33.755433 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9h27j" event={"ID":"f01b253a-c7c6-4c9e-a800-a1732ba06f37","Type":"ContainerStarted","Data":"5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6"} Feb 02 13:25:34 crc kubenswrapper[4721]: E0202 13:25:34.011864 4721 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 02 13:25:34 crc kubenswrapper[4721]: E0202 13:25:34.013292 4721 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 02 13:25:34 crc kubenswrapper[4721]: E0202 13:25:34.014641 4721 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 02 13:25:34 crc kubenswrapper[4721]: E0202 13:25:34.014674 4721 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-69599c8b5f-rjs76" podUID="ce072a84-75da-4060-9c4a-d029b3a14947" containerName="heat-engine" Feb 02 13:25:34 crc kubenswrapper[4721]: I0202 13:25:34.134785 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mlbxn" Feb 02 13:25:34 crc kubenswrapper[4721]: I0202 13:25:34.194892 4721 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mlbxn" Feb 02 13:25:34 crc kubenswrapper[4721]: I0202 13:25:34.935277 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mlbxn"] Feb 02 13:25:35 crc kubenswrapper[4721]: I0202 13:25:35.199635 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:25:35 crc kubenswrapper[4721]: I0202 13:25:35.798392 4721 generic.go:334] "Generic (PLEG): container finished" podID="f1487506-5263-4ffe-b3e0-1a7a507590f9" containerID="286f545d594e5abd14663ecadd58ca5f7b2093d194dddbbe4cdbc1b3011f703d" exitCode=0 Feb 02 13:25:35 crc kubenswrapper[4721]: I0202 13:25:35.798488 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjlf4" event={"ID":"f1487506-5263-4ffe-b3e0-1a7a507590f9","Type":"ContainerDied","Data":"286f545d594e5abd14663ecadd58ca5f7b2093d194dddbbe4cdbc1b3011f703d"} Feb 02 13:25:35 crc kubenswrapper[4721]: I0202 13:25:35.799097 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mlbxn" podUID="37372b76-ef54-4a44-9b56-dea754373219" containerName="registry-server" containerID="cri-o://c5c7de4e95c2706b26ef16b1a2612a77aaede7e751f9c5dde67c52ea9832ec9a" gracePeriod=2 Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.541389 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.625916 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mlbxn" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.662694 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-config-data\") pod \"ce072a84-75da-4060-9c4a-d029b3a14947\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.664198 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxc9d\" (UniqueName: \"kubernetes.io/projected/ce072a84-75da-4060-9c4a-d029b3a14947-kube-api-access-wxc9d\") pod \"ce072a84-75da-4060-9c4a-d029b3a14947\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.664484 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-combined-ca-bundle\") pod \"ce072a84-75da-4060-9c4a-d029b3a14947\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.664702 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37372b76-ef54-4a44-9b56-dea754373219-catalog-content\") pod \"37372b76-ef54-4a44-9b56-dea754373219\" (UID: \"37372b76-ef54-4a44-9b56-dea754373219\") " Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.664825 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-config-data-custom\") pod \"ce072a84-75da-4060-9c4a-d029b3a14947\" (UID: \"ce072a84-75da-4060-9c4a-d029b3a14947\") " Feb 02 13:25:36 crc 
kubenswrapper[4721]: I0202 13:25:36.664998 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmwvt\" (UniqueName: \"kubernetes.io/projected/37372b76-ef54-4a44-9b56-dea754373219-kube-api-access-dmwvt\") pod \"37372b76-ef54-4a44-9b56-dea754373219\" (UID: \"37372b76-ef54-4a44-9b56-dea754373219\") " Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.665140 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37372b76-ef54-4a44-9b56-dea754373219-utilities\") pod \"37372b76-ef54-4a44-9b56-dea754373219\" (UID: \"37372b76-ef54-4a44-9b56-dea754373219\") " Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.666497 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37372b76-ef54-4a44-9b56-dea754373219-utilities" (OuterVolumeSpecName: "utilities") pod "37372b76-ef54-4a44-9b56-dea754373219" (UID: "37372b76-ef54-4a44-9b56-dea754373219"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.672951 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ce072a84-75da-4060-9c4a-d029b3a14947" (UID: "ce072a84-75da-4060-9c4a-d029b3a14947"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.677256 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37372b76-ef54-4a44-9b56-dea754373219-kube-api-access-dmwvt" (OuterVolumeSpecName: "kube-api-access-dmwvt") pod "37372b76-ef54-4a44-9b56-dea754373219" (UID: "37372b76-ef54-4a44-9b56-dea754373219"). InnerVolumeSpecName "kube-api-access-dmwvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.681143 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce072a84-75da-4060-9c4a-d029b3a14947-kube-api-access-wxc9d" (OuterVolumeSpecName: "kube-api-access-wxc9d") pod "ce072a84-75da-4060-9c4a-d029b3a14947" (UID: "ce072a84-75da-4060-9c4a-d029b3a14947"). InnerVolumeSpecName "kube-api-access-wxc9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.769727 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce072a84-75da-4060-9c4a-d029b3a14947" (UID: "ce072a84-75da-4060-9c4a-d029b3a14947"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.774343 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxc9d\" (UniqueName: \"kubernetes.io/projected/ce072a84-75da-4060-9c4a-d029b3a14947-kube-api-access-wxc9d\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.774404 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.774423 4721 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.774440 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmwvt\" (UniqueName: \"kubernetes.io/projected/37372b76-ef54-4a44-9b56-dea754373219-kube-api-access-dmwvt\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.774457 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37372b76-ef54-4a44-9b56-dea754373219-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.808719 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-config-data" (OuterVolumeSpecName: "config-data") pod "ce072a84-75da-4060-9c4a-d029b3a14947" (UID: "ce072a84-75da-4060-9c4a-d029b3a14947"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.825553 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37372b76-ef54-4a44-9b56-dea754373219-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37372b76-ef54-4a44-9b56-dea754373219" (UID: "37372b76-ef54-4a44-9b56-dea754373219"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.830515 4721 generic.go:334] "Generic (PLEG): container finished" podID="ce072a84-75da-4060-9c4a-d029b3a14947" containerID="824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062" exitCode=0 Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.830722 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-69599c8b5f-rjs76" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.831112 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-69599c8b5f-rjs76" event={"ID":"ce072a84-75da-4060-9c4a-d029b3a14947","Type":"ContainerDied","Data":"824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062"} Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.831163 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-69599c8b5f-rjs76" event={"ID":"ce072a84-75da-4060-9c4a-d029b3a14947","Type":"ContainerDied","Data":"c5aaadf4a39fbac2caf4ce2bae03ec472daa951b771c303d721f863e7147d5f2"} Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.831181 4721 scope.go:117] "RemoveContainer" containerID="824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.871486 4721 generic.go:334] "Generic (PLEG): container finished" podID="37372b76-ef54-4a44-9b56-dea754373219" containerID="c5c7de4e95c2706b26ef16b1a2612a77aaede7e751f9c5dde67c52ea9832ec9a" exitCode=0 Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.872119 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlbxn" event={"ID":"37372b76-ef54-4a44-9b56-dea754373219","Type":"ContainerDied","Data":"c5c7de4e95c2706b26ef16b1a2612a77aaede7e751f9c5dde67c52ea9832ec9a"} Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.872246 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mlbxn" event={"ID":"37372b76-ef54-4a44-9b56-dea754373219","Type":"ContainerDied","Data":"527e8468434eea06358e3d4622c114662919b1ba98ef618fb71f16dfc7759e5a"} Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.872438 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mlbxn" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.876829 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce072a84-75da-4060-9c4a-d029b3a14947-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.876865 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37372b76-ef54-4a44-9b56-dea754373219-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.923365 4721 scope.go:117] "RemoveContainer" containerID="824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062" Feb 02 13:25:36 crc kubenswrapper[4721]: E0202 13:25:36.934466 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062\": container with ID starting with 824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062 not found: ID does not exist" containerID="824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.934526 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062"} err="failed to get container status \"824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062\": rpc error: code = NotFound desc = could not find container \"824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062\": container with ID starting with 824ac82da4e9fbe898c73f52200c686a08effd8f70bdcb2e1cbad03807a59062 not found: ID does not exist" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.934558 4721 scope.go:117] "RemoveContainer" containerID="c5c7de4e95c2706b26ef16b1a2612a77aaede7e751f9c5dde67c52ea9832ec9a" Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.936402 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-69599c8b5f-rjs76"] Feb 02 13:25:36 crc kubenswrapper[4721]: I0202 13:25:36.949794 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-69599c8b5f-rjs76"] Feb 02 13:25:37 crc kubenswrapper[4721]: I0202 13:25:37.018895 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mlbxn"] Feb 02 13:25:37 crc kubenswrapper[4721]: I0202 13:25:37.027278 4721 scope.go:117] "RemoveContainer" containerID="e7fd605be804cd647b2eac1ab49ff2a82cd119fbc55b1ed46d82b6f010b0c40a" Feb 02 13:25:37 crc kubenswrapper[4721]: I0202 13:25:37.029625 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mlbxn"] Feb 02 13:25:37 crc kubenswrapper[4721]: I0202 13:25:37.083270 4721 scope.go:117] "RemoveContainer" containerID="be431440dec21df753f3a84f641990a0bc2d2448dae27123ad4db374fc8f8981" Feb 02 13:25:37 crc kubenswrapper[4721]: I0202 13:25:37.160492 4721 scope.go:117] "RemoveContainer" containerID="c5c7de4e95c2706b26ef16b1a2612a77aaede7e751f9c5dde67c52ea9832ec9a" Feb 02 13:25:37 crc kubenswrapper[4721]: E0202 13:25:37.162680 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5c7de4e95c2706b26ef16b1a2612a77aaede7e751f9c5dde67c52ea9832ec9a\": container with ID starting with 
c5c7de4e95c2706b26ef16b1a2612a77aaede7e751f9c5dde67c52ea9832ec9a not found: ID does not exist" containerID="c5c7de4e95c2706b26ef16b1a2612a77aaede7e751f9c5dde67c52ea9832ec9a" Feb 02 13:25:37 crc kubenswrapper[4721]: I0202 13:25:37.162716 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5c7de4e95c2706b26ef16b1a2612a77aaede7e751f9c5dde67c52ea9832ec9a"} err="failed to get container status \"c5c7de4e95c2706b26ef16b1a2612a77aaede7e751f9c5dde67c52ea9832ec9a\": rpc error: code = NotFound desc = could not find container \"c5c7de4e95c2706b26ef16b1a2612a77aaede7e751f9c5dde67c52ea9832ec9a\": container with ID starting with c5c7de4e95c2706b26ef16b1a2612a77aaede7e751f9c5dde67c52ea9832ec9a not found: ID does not exist" Feb 02 13:25:37 crc kubenswrapper[4721]: I0202 13:25:37.162739 4721 scope.go:117] "RemoveContainer" containerID="e7fd605be804cd647b2eac1ab49ff2a82cd119fbc55b1ed46d82b6f010b0c40a" Feb 02 13:25:37 crc kubenswrapper[4721]: E0202 13:25:37.163370 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7fd605be804cd647b2eac1ab49ff2a82cd119fbc55b1ed46d82b6f010b0c40a\": container with ID starting with e7fd605be804cd647b2eac1ab49ff2a82cd119fbc55b1ed46d82b6f010b0c40a not found: ID does not exist" containerID="e7fd605be804cd647b2eac1ab49ff2a82cd119fbc55b1ed46d82b6f010b0c40a" Feb 02 13:25:37 crc kubenswrapper[4721]: I0202 13:25:37.163394 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7fd605be804cd647b2eac1ab49ff2a82cd119fbc55b1ed46d82b6f010b0c40a"} err="failed to get container status \"e7fd605be804cd647b2eac1ab49ff2a82cd119fbc55b1ed46d82b6f010b0c40a\": rpc error: code = NotFound desc = could not find container \"e7fd605be804cd647b2eac1ab49ff2a82cd119fbc55b1ed46d82b6f010b0c40a\": container with ID starting with e7fd605be804cd647b2eac1ab49ff2a82cd119fbc55b1ed46d82b6f010b0c40a not found: ID does not exist" Feb 02 13:25:37 crc kubenswrapper[4721]: I0202 13:25:37.163410 4721 scope.go:117] "RemoveContainer" containerID="be431440dec21df753f3a84f641990a0bc2d2448dae27123ad4db374fc8f8981" Feb 02 13:25:37 crc kubenswrapper[4721]: E0202 13:25:37.163981 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be431440dec21df753f3a84f641990a0bc2d2448dae27123ad4db374fc8f8981\": container with ID starting with be431440dec21df753f3a84f641990a0bc2d2448dae27123ad4db374fc8f8981 not found: ID does not exist" containerID="be431440dec21df753f3a84f641990a0bc2d2448dae27123ad4db374fc8f8981" Feb 02 13:25:37 crc kubenswrapper[4721]: I0202 13:25:37.164002 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be431440dec21df753f3a84f641990a0bc2d2448dae27123ad4db374fc8f8981"} err="failed to get container status \"be431440dec21df753f3a84f641990a0bc2d2448dae27123ad4db374fc8f8981\": rpc error: code = NotFound desc = could not find container \"be431440dec21df753f3a84f641990a0bc2d2448dae27123ad4db374fc8f8981\": container with ID starting with be431440dec21df753f3a84f641990a0bc2d2448dae27123ad4db374fc8f8981 not found: ID does not exist" Feb 02 13:25:37 crc kubenswrapper[4721]: I0202 13:25:37.888965 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjlf4" 
event={"ID":"f1487506-5263-4ffe-b3e0-1a7a507590f9","Type":"ContainerStarted","Data":"8abc226211c9966989623516d814dc227ce6e9431d716262c64d61c953cd9de5"} Feb 02 13:25:37 crc kubenswrapper[4721]: I0202 13:25:37.899430 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3","Type":"ContainerStarted","Data":"3b0302d825319ad93ac4b7a0c298d8bf5cb6775771000ecd18cf0462db227389"} Feb 02 13:25:37 crc kubenswrapper[4721]: I0202 13:25:37.924365 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bjlf4" podStartSLOduration=7.933875262 podStartE2EDuration="16.924339051s" podCreationTimestamp="2026-02-02 13:25:21 +0000 UTC" firstStartedPulling="2026-02-02 13:25:27.45931978 +0000 UTC m=+1467.761834169" lastFinishedPulling="2026-02-02 13:25:36.449783569 +0000 UTC m=+1476.752297958" observedRunningTime="2026-02-02 13:25:37.907469503 +0000 UTC m=+1478.209983912" watchObservedRunningTime="2026-02-02 13:25:37.924339051 +0000 UTC m=+1478.226853450" Feb 02 13:25:38 crc kubenswrapper[4721]: I0202 13:25:38.441458 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37372b76-ef54-4a44-9b56-dea754373219" path="/var/lib/kubelet/pods/37372b76-ef54-4a44-9b56-dea754373219/volumes" Feb 02 13:25:38 crc kubenswrapper[4721]: I0202 13:25:38.446442 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce072a84-75da-4060-9c4a-d029b3a14947" path="/var/lib/kubelet/pods/ce072a84-75da-4060-9c4a-d029b3a14947/volumes" Feb 02 13:25:38 crc kubenswrapper[4721]: I0202 13:25:38.650905 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 13:25:38 crc kubenswrapper[4721]: I0202 13:25:38.653089 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 02 13:25:38 crc kubenswrapper[4721]: I0202 13:25:38.653263 4721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 13:25:38 crc kubenswrapper[4721]: I0202 13:25:38.654824 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 13:25:38 crc kubenswrapper[4721]: I0202 13:25:38.654879 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 02 13:25:39 crc kubenswrapper[4721]: I0202 13:25:39.929922 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3","Type":"ContainerStarted","Data":"b559fdc4b6fca20276abf4dceba9979868b3f8f7cc76c9ba5d01cbaddb599222"} Feb 02 13:25:39 crc kubenswrapper[4721]: I0202 13:25:39.930173 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="ceilometer-central-agent" containerID="cri-o://250d61da9c9aec53fa8403200cdb7bf07e9fd015ad372194d53e43f3edd63adb" gracePeriod=30 Feb 02 13:25:39 crc kubenswrapper[4721]: I0202 13:25:39.930215 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="sg-core" containerID="cri-o://3b0302d825319ad93ac4b7a0c298d8bf5cb6775771000ecd18cf0462db227389" gracePeriod=30 Feb 02 13:25:39 crc kubenswrapper[4721]: I0202 13:25:39.930249 4721 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="proxy-httpd" containerID="cri-o://b559fdc4b6fca20276abf4dceba9979868b3f8f7cc76c9ba5d01cbaddb599222" gracePeriod=30 Feb 02 13:25:39 crc kubenswrapper[4721]: I0202 13:25:39.930286 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="ceilometer-notification-agent" containerID="cri-o://b0b5571eb117a42c85d555cc9f73ebf04bc2152bc212ed2d51da96cbfa4c0649" gracePeriod=30 Feb 02 13:25:39 crc kubenswrapper[4721]: I0202 13:25:39.930508 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 13:25:39 crc kubenswrapper[4721]: I0202 13:25:39.958916 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.120751072 podStartE2EDuration="11.95889688s" podCreationTimestamp="2026-02-02 13:25:28 +0000 UTC" firstStartedPulling="2026-02-02 13:25:30.256487452 +0000 UTC m=+1470.559001841" lastFinishedPulling="2026-02-02 13:25:39.09463326 +0000 UTC m=+1479.397147649" observedRunningTime="2026-02-02 13:25:39.953623777 +0000 UTC m=+1480.256138176" watchObservedRunningTime="2026-02-02 13:25:39.95889688 +0000 UTC m=+1480.261411269" Feb 02 13:25:40 crc kubenswrapper[4721]: I0202 13:25:40.948660 4721 generic.go:334] "Generic (PLEG): container finished" podID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerID="3b0302d825319ad93ac4b7a0c298d8bf5cb6775771000ecd18cf0462db227389" exitCode=2 Feb 02 13:25:40 crc kubenswrapper[4721]: I0202 13:25:40.948930 4721 generic.go:334] "Generic (PLEG): container finished" podID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerID="b0b5571eb117a42c85d555cc9f73ebf04bc2152bc212ed2d51da96cbfa4c0649" exitCode=0 Feb 02 13:25:40 crc kubenswrapper[4721]: I0202 13:25:40.948958 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3","Type":"ContainerDied","Data":"3b0302d825319ad93ac4b7a0c298d8bf5cb6775771000ecd18cf0462db227389"} Feb 02 13:25:40 crc kubenswrapper[4721]: I0202 13:25:40.948991 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3","Type":"ContainerDied","Data":"b0b5571eb117a42c85d555cc9f73ebf04bc2152bc212ed2d51da96cbfa4c0649"} Feb 02 13:25:42 crc kubenswrapper[4721]: I0202 13:25:42.125896 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bjlf4" Feb 02 13:25:42 crc kubenswrapper[4721]: I0202 13:25:42.126288 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bjlf4" Feb 02 13:25:43 crc kubenswrapper[4721]: I0202 13:25:43.212465 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-bjlf4" podUID="f1487506-5263-4ffe-b3e0-1a7a507590f9" containerName="registry-server" probeResult="failure" output=< Feb 02 13:25:43 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:25:43 crc kubenswrapper[4721]: > Feb 02 13:25:44 crc kubenswrapper[4721]: I0202 13:25:44.763532 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:25:44 crc kubenswrapper[4721]: I0202 13:25:44.763936 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:25:44 crc kubenswrapper[4721]: I0202 13:25:44.763984 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:25:44 crc kubenswrapper[4721]: I0202 13:25:44.764863 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c89d3977af7fbb3779c0661dadea0111ac2d8f3c3974c534b682ad6a4af4aac"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 13:25:44 crc kubenswrapper[4721]: I0202 13:25:44.764907 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://4c89d3977af7fbb3779c0661dadea0111ac2d8f3c3974c534b682ad6a4af4aac" gracePeriod=600 Feb 02 13:25:45 crc kubenswrapper[4721]: I0202 13:25:45.004844 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="4c89d3977af7fbb3779c0661dadea0111ac2d8f3c3974c534b682ad6a4af4aac" exitCode=0 Feb 02 13:25:45 crc kubenswrapper[4721]: I0202 13:25:45.004884 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"4c89d3977af7fbb3779c0661dadea0111ac2d8f3c3974c534b682ad6a4af4aac"} Feb 02 13:25:45 crc kubenswrapper[4721]: I0202 13:25:45.004915 4721 scope.go:117] "RemoveContainer" containerID="56e02e958f304734b98b90c0b35547a7aaeb3ba27ad6cd35ef754f549abd2513" Feb 02 13:25:49 crc kubenswrapper[4721]: I0202 13:25:49.056015 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"} Feb 02 13:25:49 crc kubenswrapper[4721]: I0202 13:25:49.058467 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9h27j" event={"ID":"f01b253a-c7c6-4c9e-a800-a1732ba06f37","Type":"ContainerStarted","Data":"8b1af4c2f51b92e7d8e686eda8c46c518ebd4e4b694ff77617f39b7b376a484a"} Feb 02 13:25:49 crc kubenswrapper[4721]: I0202 13:25:49.100315 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-9h27j" podStartSLOduration=2.8789096130000003 podStartE2EDuration="18.100292093s" podCreationTimestamp="2026-02-02 13:25:31 +0000 UTC" firstStartedPulling="2026-02-02 13:25:32.752278969 +0000 UTC m=+1473.054793368" lastFinishedPulling="2026-02-02 13:25:47.973661459 +0000 UTC m=+1488.276175848" observedRunningTime="2026-02-02 13:25:49.090641711 +0000 UTC m=+1489.393156100" watchObservedRunningTime="2026-02-02 
13:25:49.100292093 +0000 UTC m=+1489.402806502" Feb 02 13:25:49 crc kubenswrapper[4721]: I0202 13:25:49.881790 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6vl65"] Feb 02 13:25:49 crc kubenswrapper[4721]: E0202 13:25:49.883237 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37372b76-ef54-4a44-9b56-dea754373219" containerName="registry-server" Feb 02 13:25:49 crc kubenswrapper[4721]: I0202 13:25:49.883262 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="37372b76-ef54-4a44-9b56-dea754373219" containerName="registry-server" Feb 02 13:25:49 crc kubenswrapper[4721]: E0202 13:25:49.883283 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37372b76-ef54-4a44-9b56-dea754373219" containerName="extract-utilities" Feb 02 13:25:49 crc kubenswrapper[4721]: I0202 13:25:49.883292 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="37372b76-ef54-4a44-9b56-dea754373219" containerName="extract-utilities" Feb 02 13:25:49 crc kubenswrapper[4721]: E0202 13:25:49.883322 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce072a84-75da-4060-9c4a-d029b3a14947" containerName="heat-engine" Feb 02 13:25:49 crc kubenswrapper[4721]: I0202 13:25:49.883332 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce072a84-75da-4060-9c4a-d029b3a14947" containerName="heat-engine" Feb 02 13:25:49 crc kubenswrapper[4721]: E0202 13:25:49.883365 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37372b76-ef54-4a44-9b56-dea754373219" containerName="extract-content" Feb 02 13:25:49 crc kubenswrapper[4721]: I0202 13:25:49.883376 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="37372b76-ef54-4a44-9b56-dea754373219" containerName="extract-content" Feb 02 13:25:49 crc kubenswrapper[4721]: I0202 13:25:49.883639 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce072a84-75da-4060-9c4a-d029b3a14947" containerName="heat-engine" Feb 02 13:25:49 crc kubenswrapper[4721]: I0202 13:25:49.883671 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="37372b76-ef54-4a44-9b56-dea754373219" containerName="registry-server" Feb 02 13:25:49 crc kubenswrapper[4721]: I0202 13:25:49.885773 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:25:49 crc kubenswrapper[4721]: I0202 13:25:49.900733 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6vl65"] Feb 02 13:25:50 crc kubenswrapper[4721]: I0202 13:25:50.015536 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2db9546-309a-4e2a-8b57-e2986e6cb500-utilities\") pod \"certified-operators-6vl65\" (UID: \"c2db9546-309a-4e2a-8b57-e2986e6cb500\") " pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:25:50 crc kubenswrapper[4721]: I0202 13:25:50.015642 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2db9546-309a-4e2a-8b57-e2986e6cb500-catalog-content\") pod \"certified-operators-6vl65\" (UID: \"c2db9546-309a-4e2a-8b57-e2986e6cb500\") " pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:25:50 crc kubenswrapper[4721]: I0202 13:25:50.015781 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6ncw\" (UniqueName: \"kubernetes.io/projected/c2db9546-309a-4e2a-8b57-e2986e6cb500-kube-api-access-b6ncw\") pod \"certified-operators-6vl65\" (UID: \"c2db9546-309a-4e2a-8b57-e2986e6cb500\") " pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:25:50 crc kubenswrapper[4721]: I0202 13:25:50.117530 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6ncw\" (UniqueName: \"kubernetes.io/projected/c2db9546-309a-4e2a-8b57-e2986e6cb500-kube-api-access-b6ncw\") pod \"certified-operators-6vl65\" (UID: \"c2db9546-309a-4e2a-8b57-e2986e6cb500\") " pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:25:50 crc kubenswrapper[4721]: I0202 13:25:50.117638 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2db9546-309a-4e2a-8b57-e2986e6cb500-utilities\") pod \"certified-operators-6vl65\" (UID: \"c2db9546-309a-4e2a-8b57-e2986e6cb500\") " pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:25:50 crc kubenswrapper[4721]: I0202 13:25:50.117709 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2db9546-309a-4e2a-8b57-e2986e6cb500-catalog-content\") pod \"certified-operators-6vl65\" (UID: \"c2db9546-309a-4e2a-8b57-e2986e6cb500\") " pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:25:50 crc kubenswrapper[4721]: I0202 13:25:50.118431 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2db9546-309a-4e2a-8b57-e2986e6cb500-catalog-content\") pod \"certified-operators-6vl65\" (UID: \"c2db9546-309a-4e2a-8b57-e2986e6cb500\") " pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:25:50 crc kubenswrapper[4721]: I0202 13:25:50.118701 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2db9546-309a-4e2a-8b57-e2986e6cb500-utilities\") pod \"certified-operators-6vl65\" (UID: \"c2db9546-309a-4e2a-8b57-e2986e6cb500\") " pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:25:50 crc kubenswrapper[4721]: I0202 13:25:50.149326 4721 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b6ncw\" (UniqueName: \"kubernetes.io/projected/c2db9546-309a-4e2a-8b57-e2986e6cb500-kube-api-access-b6ncw\") pod \"certified-operators-6vl65\" (UID: \"c2db9546-309a-4e2a-8b57-e2986e6cb500\") " pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:25:50 crc kubenswrapper[4721]: I0202 13:25:50.204805 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:25:50 crc kubenswrapper[4721]: W0202 13:25:50.767701 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2db9546_309a_4e2a_8b57_e2986e6cb500.slice/crio-ae96317dbbd9659e948a12834719912af10f2e2c9b3f171a99e46548715f116d WatchSource:0}: Error finding container ae96317dbbd9659e948a12834719912af10f2e2c9b3f171a99e46548715f116d: Status 404 returned error can't find the container with id ae96317dbbd9659e948a12834719912af10f2e2c9b3f171a99e46548715f116d Feb 02 13:25:50 crc kubenswrapper[4721]: I0202 13:25:50.779913 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6vl65"] Feb 02 13:25:51 crc kubenswrapper[4721]: I0202 13:25:51.086942 4721 generic.go:334] "Generic (PLEG): container finished" podID="c2db9546-309a-4e2a-8b57-e2986e6cb500" containerID="214137d2119b0f131d16421f0c5475602028f01ed014db4a693d69884c3bf169" exitCode=0 Feb 02 13:25:51 crc kubenswrapper[4721]: I0202 13:25:51.087041 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vl65" event={"ID":"c2db9546-309a-4e2a-8b57-e2986e6cb500","Type":"ContainerDied","Data":"214137d2119b0f131d16421f0c5475602028f01ed014db4a693d69884c3bf169"} Feb 02 13:25:51 crc kubenswrapper[4721]: I0202 13:25:51.087339 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vl65" event={"ID":"c2db9546-309a-4e2a-8b57-e2986e6cb500","Type":"ContainerStarted","Data":"ae96317dbbd9659e948a12834719912af10f2e2c9b3f171a99e46548715f116d"} Feb 02 13:25:51 crc kubenswrapper[4721]: I0202 13:25:51.090927 4721 generic.go:334] "Generic (PLEG): container finished" podID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerID="250d61da9c9aec53fa8403200cdb7bf07e9fd015ad372194d53e43f3edd63adb" exitCode=0 Feb 02 13:25:51 crc kubenswrapper[4721]: I0202 13:25:51.090970 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3","Type":"ContainerDied","Data":"250d61da9c9aec53fa8403200cdb7bf07e9fd015ad372194d53e43f3edd63adb"} Feb 02 13:25:52 crc kubenswrapper[4721]: I0202 13:25:52.173911 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bjlf4" Feb 02 13:25:52 crc kubenswrapper[4721]: I0202 13:25:52.237102 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bjlf4" Feb 02 13:25:53 crc kubenswrapper[4721]: I0202 13:25:53.127237 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vl65" event={"ID":"c2db9546-309a-4e2a-8b57-e2986e6cb500","Type":"ContainerStarted","Data":"1d340a91edc1a2d06152b7bbe13ebe7e5ae745bad0fc295760f0755f827254e4"} Feb 02 13:25:53 crc kubenswrapper[4721]: I0202 13:25:53.849812 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjlf4"] Feb 02 13:25:54 
crc kubenswrapper[4721]: I0202 13:25:54.134461 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bjlf4" podUID="f1487506-5263-4ffe-b3e0-1a7a507590f9" containerName="registry-server" containerID="cri-o://8abc226211c9966989623516d814dc227ce6e9431d716262c64d61c953cd9de5" gracePeriod=2 Feb 02 13:25:54 crc kubenswrapper[4721]: I0202 13:25:54.700921 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjlf4" Feb 02 13:25:54 crc kubenswrapper[4721]: I0202 13:25:54.739203 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1487506-5263-4ffe-b3e0-1a7a507590f9-catalog-content\") pod \"f1487506-5263-4ffe-b3e0-1a7a507590f9\" (UID: \"f1487506-5263-4ffe-b3e0-1a7a507590f9\") " Feb 02 13:25:54 crc kubenswrapper[4721]: I0202 13:25:54.739256 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1487506-5263-4ffe-b3e0-1a7a507590f9-utilities\") pod \"f1487506-5263-4ffe-b3e0-1a7a507590f9\" (UID: \"f1487506-5263-4ffe-b3e0-1a7a507590f9\") " Feb 02 13:25:54 crc kubenswrapper[4721]: I0202 13:25:54.739486 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvkl7\" (UniqueName: \"kubernetes.io/projected/f1487506-5263-4ffe-b3e0-1a7a507590f9-kube-api-access-rvkl7\") pod \"f1487506-5263-4ffe-b3e0-1a7a507590f9\" (UID: \"f1487506-5263-4ffe-b3e0-1a7a507590f9\") " Feb 02 13:25:54 crc kubenswrapper[4721]: I0202 13:25:54.740271 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1487506-5263-4ffe-b3e0-1a7a507590f9-utilities" (OuterVolumeSpecName: "utilities") pod "f1487506-5263-4ffe-b3e0-1a7a507590f9" (UID: "f1487506-5263-4ffe-b3e0-1a7a507590f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:25:54 crc kubenswrapper[4721]: I0202 13:25:54.764977 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1487506-5263-4ffe-b3e0-1a7a507590f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1487506-5263-4ffe-b3e0-1a7a507590f9" (UID: "f1487506-5263-4ffe-b3e0-1a7a507590f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:25:54 crc kubenswrapper[4721]: I0202 13:25:54.765799 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1487506-5263-4ffe-b3e0-1a7a507590f9-kube-api-access-rvkl7" (OuterVolumeSpecName: "kube-api-access-rvkl7") pod "f1487506-5263-4ffe-b3e0-1a7a507590f9" (UID: "f1487506-5263-4ffe-b3e0-1a7a507590f9"). InnerVolumeSpecName "kube-api-access-rvkl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:25:54 crc kubenswrapper[4721]: I0202 13:25:54.842052 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1487506-5263-4ffe-b3e0-1a7a507590f9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:54 crc kubenswrapper[4721]: I0202 13:25:54.842119 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1487506-5263-4ffe-b3e0-1a7a507590f9-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:54 crc kubenswrapper[4721]: I0202 13:25:54.842129 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvkl7\" (UniqueName: \"kubernetes.io/projected/f1487506-5263-4ffe-b3e0-1a7a507590f9-kube-api-access-rvkl7\") on node \"crc\" DevicePath \"\"" Feb 02 13:25:55 crc kubenswrapper[4721]: I0202 13:25:55.150175 4721 generic.go:334] "Generic (PLEG): container finished" podID="f1487506-5263-4ffe-b3e0-1a7a507590f9" containerID="8abc226211c9966989623516d814dc227ce6e9431d716262c64d61c953cd9de5" exitCode=0 Feb 02 13:25:55 crc kubenswrapper[4721]: I0202 13:25:55.150215 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjlf4" event={"ID":"f1487506-5263-4ffe-b3e0-1a7a507590f9","Type":"ContainerDied","Data":"8abc226211c9966989623516d814dc227ce6e9431d716262c64d61c953cd9de5"} Feb 02 13:25:55 crc kubenswrapper[4721]: I0202 13:25:55.150240 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bjlf4" event={"ID":"f1487506-5263-4ffe-b3e0-1a7a507590f9","Type":"ContainerDied","Data":"baa4f2b8c7f38c357e7fd8029e914fb27779853104aeda245f29bdec34cc4815"} Feb 02 13:25:55 crc kubenswrapper[4721]: I0202 13:25:55.150257 4721 scope.go:117] "RemoveContainer" containerID="8abc226211c9966989623516d814dc227ce6e9431d716262c64d61c953cd9de5" Feb 02 13:25:55 crc kubenswrapper[4721]: I0202 13:25:55.150388 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bjlf4" Feb 02 13:25:55 crc kubenswrapper[4721]: I0202 13:25:55.174349 4721 scope.go:117] "RemoveContainer" containerID="286f545d594e5abd14663ecadd58ca5f7b2093d194dddbbe4cdbc1b3011f703d" Feb 02 13:25:55 crc kubenswrapper[4721]: I0202 13:25:55.194030 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjlf4"] Feb 02 13:25:55 crc kubenswrapper[4721]: I0202 13:25:55.207971 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bjlf4"] Feb 02 13:25:55 crc kubenswrapper[4721]: I0202 13:25:55.214113 4721 scope.go:117] "RemoveContainer" containerID="4e9ec40449c9a13570d177a0fa260a694310bc1d1b70021291444805d29dc83d" Feb 02 13:25:55 crc kubenswrapper[4721]: E0202 13:25:55.329822 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1487506_5263_4ffe_b3e0_1a7a507590f9.slice\": RecentStats: unable to find data in memory cache]" Feb 02 13:25:55 crc kubenswrapper[4721]: I0202 13:25:55.344229 4721 scope.go:117] "RemoveContainer" containerID="8abc226211c9966989623516d814dc227ce6e9431d716262c64d61c953cd9de5" Feb 02 13:25:55 crc kubenswrapper[4721]: E0202 13:25:55.349781 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8abc226211c9966989623516d814dc227ce6e9431d716262c64d61c953cd9de5\": container with ID starting with 8abc226211c9966989623516d814dc227ce6e9431d716262c64d61c953cd9de5 not found: ID does not exist" containerID="8abc226211c9966989623516d814dc227ce6e9431d716262c64d61c953cd9de5" Feb 02 13:25:55 crc kubenswrapper[4721]: I0202 13:25:55.349832 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8abc226211c9966989623516d814dc227ce6e9431d716262c64d61c953cd9de5"} err="failed to get container status \"8abc226211c9966989623516d814dc227ce6e9431d716262c64d61c953cd9de5\": rpc error: code = NotFound desc = could not find container \"8abc226211c9966989623516d814dc227ce6e9431d716262c64d61c953cd9de5\": container with ID starting with 8abc226211c9966989623516d814dc227ce6e9431d716262c64d61c953cd9de5 not found: ID does not exist" Feb 02 13:25:55 crc kubenswrapper[4721]: I0202 13:25:55.349859 4721 scope.go:117] "RemoveContainer" containerID="286f545d594e5abd14663ecadd58ca5f7b2093d194dddbbe4cdbc1b3011f703d" Feb 02 13:25:55 crc kubenswrapper[4721]: E0202 13:25:55.350560 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"286f545d594e5abd14663ecadd58ca5f7b2093d194dddbbe4cdbc1b3011f703d\": container with ID starting with 286f545d594e5abd14663ecadd58ca5f7b2093d194dddbbe4cdbc1b3011f703d not found: ID does not exist" containerID="286f545d594e5abd14663ecadd58ca5f7b2093d194dddbbe4cdbc1b3011f703d" Feb 02 13:25:55 crc kubenswrapper[4721]: I0202 13:25:55.350598 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286f545d594e5abd14663ecadd58ca5f7b2093d194dddbbe4cdbc1b3011f703d"} err="failed to get container status \"286f545d594e5abd14663ecadd58ca5f7b2093d194dddbbe4cdbc1b3011f703d\": rpc error: code = NotFound desc = could not find container \"286f545d594e5abd14663ecadd58ca5f7b2093d194dddbbe4cdbc1b3011f703d\": container with ID starting with 
286f545d594e5abd14663ecadd58ca5f7b2093d194dddbbe4cdbc1b3011f703d not found: ID does not exist" Feb 02 13:25:55 crc kubenswrapper[4721]: I0202 13:25:55.350624 4721 scope.go:117] "RemoveContainer" containerID="4e9ec40449c9a13570d177a0fa260a694310bc1d1b70021291444805d29dc83d" Feb 02 13:25:55 crc kubenswrapper[4721]: E0202 13:25:55.351048 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e9ec40449c9a13570d177a0fa260a694310bc1d1b70021291444805d29dc83d\": container with ID starting with 4e9ec40449c9a13570d177a0fa260a694310bc1d1b70021291444805d29dc83d not found: ID does not exist" containerID="4e9ec40449c9a13570d177a0fa260a694310bc1d1b70021291444805d29dc83d" Feb 02 13:25:55 crc kubenswrapper[4721]: I0202 13:25:55.351116 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e9ec40449c9a13570d177a0fa260a694310bc1d1b70021291444805d29dc83d"} err="failed to get container status \"4e9ec40449c9a13570d177a0fa260a694310bc1d1b70021291444805d29dc83d\": rpc error: code = NotFound desc = could not find container \"4e9ec40449c9a13570d177a0fa260a694310bc1d1b70021291444805d29dc83d\": container with ID starting with 4e9ec40449c9a13570d177a0fa260a694310bc1d1b70021291444805d29dc83d not found: ID does not exist" Feb 02 13:25:56 crc kubenswrapper[4721]: I0202 13:25:56.174988 4721 generic.go:334] "Generic (PLEG): container finished" podID="c2db9546-309a-4e2a-8b57-e2986e6cb500" containerID="1d340a91edc1a2d06152b7bbe13ebe7e5ae745bad0fc295760f0755f827254e4" exitCode=0 Feb 02 13:25:56 crc kubenswrapper[4721]: I0202 13:25:56.175106 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vl65" event={"ID":"c2db9546-309a-4e2a-8b57-e2986e6cb500","Type":"ContainerDied","Data":"1d340a91edc1a2d06152b7bbe13ebe7e5ae745bad0fc295760f0755f827254e4"} Feb 02 13:25:56 crc kubenswrapper[4721]: I0202 13:25:56.423729 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1487506-5263-4ffe-b3e0-1a7a507590f9" path="/var/lib/kubelet/pods/f1487506-5263-4ffe-b3e0-1a7a507590f9/volumes" Feb 02 13:25:57 crc kubenswrapper[4721]: I0202 13:25:57.193135 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vl65" event={"ID":"c2db9546-309a-4e2a-8b57-e2986e6cb500","Type":"ContainerStarted","Data":"34f03665f21982d0db05900d73ae2a2969e1e103422293a6d050c2f0d4308917"} Feb 02 13:25:57 crc kubenswrapper[4721]: I0202 13:25:57.210688 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6vl65" podStartSLOduration=2.585467734 podStartE2EDuration="8.210671665s" podCreationTimestamp="2026-02-02 13:25:49 +0000 UTC" firstStartedPulling="2026-02-02 13:25:51.088983209 +0000 UTC m=+1491.391497598" lastFinishedPulling="2026-02-02 13:25:56.71418714 +0000 UTC m=+1497.016701529" observedRunningTime="2026-02-02 13:25:57.209630087 +0000 UTC m=+1497.512144486" watchObservedRunningTime="2026-02-02 13:25:57.210671665 +0000 UTC m=+1497.513186054" Feb 02 13:25:59 crc kubenswrapper[4721]: I0202 13:25:59.097680 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 02 13:26:00 crc kubenswrapper[4721]: I0202 13:26:00.206104 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:26:00 crc kubenswrapper[4721]: I0202 13:26:00.206390 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:26:00 crc kubenswrapper[4721]: I0202 13:26:00.265950 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:26:04 crc kubenswrapper[4721]: I0202 13:26:04.266042 4721 generic.go:334] "Generic (PLEG): container finished" podID="f01b253a-c7c6-4c9e-a800-a1732ba06f37" containerID="8b1af4c2f51b92e7d8e686eda8c46c518ebd4e4b694ff77617f39b7b376a484a" exitCode=0 Feb 02 13:26:04 crc kubenswrapper[4721]: I0202 13:26:04.266129 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9h27j" event={"ID":"f01b253a-c7c6-4c9e-a800-a1732ba06f37","Type":"ContainerDied","Data":"8b1af4c2f51b92e7d8e686eda8c46c518ebd4e4b694ff77617f39b7b376a484a"} Feb 02 13:26:05 crc kubenswrapper[4721]: I0202 13:26:05.711104 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:26:05 crc kubenswrapper[4721]: I0202 13:26:05.808824 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-config-data\") pod \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\" (UID: \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " Feb 02 13:26:05 crc kubenswrapper[4721]: I0202 13:26:05.809095 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-combined-ca-bundle\") pod \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\" (UID: \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " Feb 02 13:26:05 crc kubenswrapper[4721]: I0202 13:26:05.809137 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc8q5\" (UniqueName: \"kubernetes.io/projected/f01b253a-c7c6-4c9e-a800-a1732ba06f37-kube-api-access-zc8q5\") pod \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\" (UID: \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " Feb 02 13:26:05 crc kubenswrapper[4721]: I0202 13:26:05.809158 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-scripts\") pod \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\" (UID: \"f01b253a-c7c6-4c9e-a800-a1732ba06f37\") " Feb 02 13:26:05 crc kubenswrapper[4721]: I0202 13:26:05.815022 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f01b253a-c7c6-4c9e-a800-a1732ba06f37-kube-api-access-zc8q5" (OuterVolumeSpecName: "kube-api-access-zc8q5") pod "f01b253a-c7c6-4c9e-a800-a1732ba06f37" (UID: "f01b253a-c7c6-4c9e-a800-a1732ba06f37"). InnerVolumeSpecName "kube-api-access-zc8q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:26:05 crc kubenswrapper[4721]: I0202 13:26:05.815378 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-scripts" (OuterVolumeSpecName: "scripts") pod "f01b253a-c7c6-4c9e-a800-a1732ba06f37" (UID: "f01b253a-c7c6-4c9e-a800-a1732ba06f37"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:05 crc kubenswrapper[4721]: I0202 13:26:05.847916 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f01b253a-c7c6-4c9e-a800-a1732ba06f37" (UID: "f01b253a-c7c6-4c9e-a800-a1732ba06f37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:05 crc kubenswrapper[4721]: I0202 13:26:05.848948 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-config-data" (OuterVolumeSpecName: "config-data") pod "f01b253a-c7c6-4c9e-a800-a1732ba06f37" (UID: "f01b253a-c7c6-4c9e-a800-a1732ba06f37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:05 crc kubenswrapper[4721]: I0202 13:26:05.911722 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:05 crc kubenswrapper[4721]: I0202 13:26:05.911766 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:05 crc kubenswrapper[4721]: I0202 13:26:05.911781 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc8q5\" (UniqueName: \"kubernetes.io/projected/f01b253a-c7c6-4c9e-a800-a1732ba06f37-kube-api-access-zc8q5\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:05 crc kubenswrapper[4721]: I0202 13:26:05.911793 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f01b253a-c7c6-4c9e-a800-a1732ba06f37-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.287804 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9h27j" event={"ID":"f01b253a-c7c6-4c9e-a800-a1732ba06f37","Type":"ContainerDied","Data":"5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6"} Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.287834 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9h27j" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.287851 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.457490 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 13:26:06 crc kubenswrapper[4721]: E0202 13:26:06.458016 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1487506-5263-4ffe-b3e0-1a7a507590f9" containerName="extract-content" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.458032 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1487506-5263-4ffe-b3e0-1a7a507590f9" containerName="extract-content" Feb 02 13:26:06 crc kubenswrapper[4721]: E0202 13:26:06.458052 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1487506-5263-4ffe-b3e0-1a7a507590f9" containerName="extract-utilities" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.458058 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1487506-5263-4ffe-b3e0-1a7a507590f9" containerName="extract-utilities" Feb 02 13:26:06 crc kubenswrapper[4721]: E0202 13:26:06.458109 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1487506-5263-4ffe-b3e0-1a7a507590f9" containerName="registry-server" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.458116 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1487506-5263-4ffe-b3e0-1a7a507590f9" containerName="registry-server" Feb 02 13:26:06 crc kubenswrapper[4721]: E0202 13:26:06.458131 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01b253a-c7c6-4c9e-a800-a1732ba06f37" containerName="nova-cell0-conductor-db-sync" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.458137 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01b253a-c7c6-4c9e-a800-a1732ba06f37" containerName="nova-cell0-conductor-db-sync" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.458341 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1487506-5263-4ffe-b3e0-1a7a507590f9" containerName="registry-server" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.458362 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="f01b253a-c7c6-4c9e-a800-a1732ba06f37" containerName="nova-cell0-conductor-db-sync" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.459171 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.461300 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mq69d" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.461498 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.468426 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.626001 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrb8t\" (UniqueName: \"kubernetes.io/projected/562aee22-e2a0-4706-b65a-7e9398823dec-kube-api-access-nrb8t\") pod \"nova-cell0-conductor-0\" (UID: \"562aee22-e2a0-4706-b65a-7e9398823dec\") " pod="openstack/nova-cell0-conductor-0" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.626475 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/562aee22-e2a0-4706-b65a-7e9398823dec-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"562aee22-e2a0-4706-b65a-7e9398823dec\") " pod="openstack/nova-cell0-conductor-0" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.626655 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/562aee22-e2a0-4706-b65a-7e9398823dec-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"562aee22-e2a0-4706-b65a-7e9398823dec\") " pod="openstack/nova-cell0-conductor-0" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.729571 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/562aee22-e2a0-4706-b65a-7e9398823dec-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"562aee22-e2a0-4706-b65a-7e9398823dec\") " pod="openstack/nova-cell0-conductor-0" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.729701 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/562aee22-e2a0-4706-b65a-7e9398823dec-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"562aee22-e2a0-4706-b65a-7e9398823dec\") " pod="openstack/nova-cell0-conductor-0" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.729902 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrb8t\" (UniqueName: \"kubernetes.io/projected/562aee22-e2a0-4706-b65a-7e9398823dec-kube-api-access-nrb8t\") pod \"nova-cell0-conductor-0\" (UID: \"562aee22-e2a0-4706-b65a-7e9398823dec\") " pod="openstack/nova-cell0-conductor-0" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.734789 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/562aee22-e2a0-4706-b65a-7e9398823dec-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"562aee22-e2a0-4706-b65a-7e9398823dec\") " pod="openstack/nova-cell0-conductor-0" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.736223 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/562aee22-e2a0-4706-b65a-7e9398823dec-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"562aee22-e2a0-4706-b65a-7e9398823dec\") " pod="openstack/nova-cell0-conductor-0" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.758600 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrb8t\" (UniqueName: \"kubernetes.io/projected/562aee22-e2a0-4706-b65a-7e9398823dec-kube-api-access-nrb8t\") pod \"nova-cell0-conductor-0\" (UID: \"562aee22-e2a0-4706-b65a-7e9398823dec\") " pod="openstack/nova-cell0-conductor-0" Feb 02 13:26:06 crc kubenswrapper[4721]: I0202 13:26:06.798748 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 13:26:07 crc kubenswrapper[4721]: I0202 13:26:07.272492 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 13:26:07 crc kubenswrapper[4721]: I0202 13:26:07.298925 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"562aee22-e2a0-4706-b65a-7e9398823dec","Type":"ContainerStarted","Data":"ffecfe3f218a2f41dffa98466ead7a510067b259e94cd90543b21b12af58ebc1"} Feb 02 13:26:08 crc kubenswrapper[4721]: I0202 13:26:08.309917 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"562aee22-e2a0-4706-b65a-7e9398823dec","Type":"ContainerStarted","Data":"fbd59eeadba9cd8dbe6ae843c75399a80ca2280eb621f91ea7107f191702cb00"} Feb 02 13:26:08 crc kubenswrapper[4721]: I0202 13:26:08.310290 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 02 13:26:08 crc kubenswrapper[4721]: I0202 13:26:08.338271 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.338253405 podStartE2EDuration="2.338253405s" podCreationTimestamp="2026-02-02 13:26:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:26:08.328982485 +0000 UTC m=+1508.631496894" watchObservedRunningTime="2026-02-02 13:26:08.338253405 +0000 UTC m=+1508.640767794" Feb 02 13:26:09 crc kubenswrapper[4721]: E0202 13:26:09.997135 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01b253a_c7c6_4c9e_a800_a1732ba06f37.slice/crio-5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6\": RecentStats: unable to find data in memory cache]" Feb 02 13:26:10 crc kubenswrapper[4721]: E0202 13:26:10.053103 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01b253a_c7c6_4c9e_a800_a1732ba06f37.slice/crio-5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6\": RecentStats: unable to find data in memory cache]" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.204296 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.307386 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.330973 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-config-data-custom\") pod \"bddd12fa-0653-4199-867f-bfdf51350b39\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.331032 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-combined-ca-bundle\") pod \"bddd12fa-0653-4199-867f-bfdf51350b39\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.331108 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jwmm\" (UniqueName: \"kubernetes.io/projected/bddd12fa-0653-4199-867f-bfdf51350b39-kube-api-access-6jwmm\") pod \"bddd12fa-0653-4199-867f-bfdf51350b39\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.331279 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-config-data\") pod \"bddd12fa-0653-4199-867f-bfdf51350b39\" (UID: \"bddd12fa-0653-4199-867f-bfdf51350b39\") " Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.340315 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bddd12fa-0653-4199-867f-bfdf51350b39" (UID: "bddd12fa-0653-4199-867f-bfdf51350b39"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.351576 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bddd12fa-0653-4199-867f-bfdf51350b39-kube-api-access-6jwmm" (OuterVolumeSpecName: "kube-api-access-6jwmm") pod "bddd12fa-0653-4199-867f-bfdf51350b39" (UID: "bddd12fa-0653-4199-867f-bfdf51350b39"). InnerVolumeSpecName "kube-api-access-6jwmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.398323 4721 generic.go:334] "Generic (PLEG): container finished" podID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerID="b559fdc4b6fca20276abf4dceba9979868b3f8f7cc76c9ba5d01cbaddb599222" exitCode=137 Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.398441 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3","Type":"ContainerDied","Data":"b559fdc4b6fca20276abf4dceba9979868b3f8f7cc76c9ba5d01cbaddb599222"} Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.426603 4721 generic.go:334] "Generic (PLEG): container finished" podID="bddd12fa-0653-4199-867f-bfdf51350b39" containerID="c89c6bf878c74f41052b7958471a080d9fee687b5e9464975790c1897da44a75" exitCode=137 Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.438399 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.438956 4721 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.441498 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jwmm\" (UniqueName: \"kubernetes.io/projected/bddd12fa-0653-4199-867f-bfdf51350b39-kube-api-access-6jwmm\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.486035 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bddd12fa-0653-4199-867f-bfdf51350b39" (UID: "bddd12fa-0653-4199-867f-bfdf51350b39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.495927 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" event={"ID":"bddd12fa-0653-4199-867f-bfdf51350b39","Type":"ContainerDied","Data":"c89c6bf878c74f41052b7958471a080d9fee687b5e9464975790c1897da44a75"} Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.495988 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-59bff7fb85-wq6q5" event={"ID":"bddd12fa-0653-4199-867f-bfdf51350b39","Type":"ContainerDied","Data":"0f9d2ecaa1c8c841d9801b80ea33e35c8c2a2c815bca71770000a6385ed7be15"} Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.496008 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6vl65"] Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.496290 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6vl65" podUID="c2db9546-309a-4e2a-8b57-e2986e6cb500" containerName="registry-server" containerID="cri-o://34f03665f21982d0db05900d73ae2a2969e1e103422293a6d050c2f0d4308917" gracePeriod=2 Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.496413 4721 scope.go:117] "RemoveContainer" containerID="c89c6bf878c74f41052b7958471a080d9fee687b5e9464975790c1897da44a75" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.541079 4721 scope.go:117] "RemoveContainer" containerID="c89c6bf878c74f41052b7958471a080d9fee687b5e9464975790c1897da44a75" Feb 02 13:26:10 crc kubenswrapper[4721]: E0202 13:26:10.544756 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c89c6bf878c74f41052b7958471a080d9fee687b5e9464975790c1897da44a75\": container with ID starting with c89c6bf878c74f41052b7958471a080d9fee687b5e9464975790c1897da44a75 not found: ID does not exist" containerID="c89c6bf878c74f41052b7958471a080d9fee687b5e9464975790c1897da44a75" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.544805 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c89c6bf878c74f41052b7958471a080d9fee687b5e9464975790c1897da44a75"} err="failed to get container status \"c89c6bf878c74f41052b7958471a080d9fee687b5e9464975790c1897da44a75\": rpc error: code = NotFound desc = could not find container \"c89c6bf878c74f41052b7958471a080d9fee687b5e9464975790c1897da44a75\": container with ID 
starting with c89c6bf878c74f41052b7958471a080d9fee687b5e9464975790c1897da44a75 not found: ID does not exist" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.546842 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.548953 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-config-data" (OuterVolumeSpecName: "config-data") pod "bddd12fa-0653-4199-867f-bfdf51350b39" (UID: "bddd12fa-0653-4199-867f-bfdf51350b39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.644246 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.648921 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bddd12fa-0653-4199-867f-bfdf51350b39-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.750181 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-scripts\") pod \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.750282 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85frw\" (UniqueName: \"kubernetes.io/projected/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-kube-api-access-85frw\") pod \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.750437 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-run-httpd\") pod \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.750465 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-config-data\") pod \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.750534 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-combined-ca-bundle\") pod \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.750565 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-log-httpd\") pod \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.750669 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-sg-core-conf-yaml\") pod \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\" (UID: \"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3\") " Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.752822 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" (UID: "d1170efb-7d4f-4c8b-84dd-6d86925bf5d3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.756321 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-kube-api-access-85frw" (OuterVolumeSpecName: "kube-api-access-85frw") pod "d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" (UID: "d1170efb-7d4f-4c8b-84dd-6d86925bf5d3"). InnerVolumeSpecName "kube-api-access-85frw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.756618 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-scripts" (OuterVolumeSpecName: "scripts") pod "d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" (UID: "d1170efb-7d4f-4c8b-84dd-6d86925bf5d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.757346 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" (UID: "d1170efb-7d4f-4c8b-84dd-6d86925bf5d3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.853851 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-59bff7fb85-wq6q5"] Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.859396 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" (UID: "d1170efb-7d4f-4c8b-84dd-6d86925bf5d3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.863234 4721 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.863279 4721 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.863299 4721 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.863312 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.863323 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85frw\" (UniqueName: \"kubernetes.io/projected/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-kube-api-access-85frw\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.875188 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-59bff7fb85-wq6q5"] Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.967269 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" (UID: "d1170efb-7d4f-4c8b-84dd-6d86925bf5d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:10 crc kubenswrapper[4721]: I0202 13:26:10.973387 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.073057 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-config-data" (OuterVolumeSpecName: "config-data") pod "d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" (UID: "d1170efb-7d4f-4c8b-84dd-6d86925bf5d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.075133 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.201681 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-gq5tv"] Feb 02 13:26:11 crc kubenswrapper[4721]: E0202 13:26:11.202691 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="sg-core" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.202843 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="sg-core" Feb 02 13:26:11 crc kubenswrapper[4721]: E0202 13:26:11.202934 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="proxy-httpd" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.203004 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="proxy-httpd" Feb 02 13:26:11 crc kubenswrapper[4721]: E0202 13:26:11.203089 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bddd12fa-0653-4199-867f-bfdf51350b39" containerName="heat-cfnapi" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.203206 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="bddd12fa-0653-4199-867f-bfdf51350b39" containerName="heat-cfnapi" Feb 02 13:26:11 crc kubenswrapper[4721]: E0202 13:26:11.203305 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="ceilometer-central-agent" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.203379 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="ceilometer-central-agent" Feb 02 13:26:11 crc kubenswrapper[4721]: E0202 13:26:11.203491 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="ceilometer-notification-agent" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.203566 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="ceilometer-notification-agent" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.203910 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="ceilometer-central-agent" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.204010 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="bddd12fa-0653-4199-867f-bfdf51350b39" containerName="heat-cfnapi" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.204120 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="sg-core" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.204233 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="ceilometer-notification-agent" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.204335 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" containerName="proxy-httpd" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.205503 4721 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-gq5tv" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.245646 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-gq5tv"] Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.254974 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.282378 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-2c26-account-create-update-2r4tb"] Feb 02 13:26:11 crc kubenswrapper[4721]: E0202 13:26:11.283615 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2db9546-309a-4e2a-8b57-e2986e6cb500" containerName="extract-utilities" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.283729 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2db9546-309a-4e2a-8b57-e2986e6cb500" containerName="extract-utilities" Feb 02 13:26:11 crc kubenswrapper[4721]: E0202 13:26:11.283838 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2db9546-309a-4e2a-8b57-e2986e6cb500" containerName="registry-server" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.283932 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2db9546-309a-4e2a-8b57-e2986e6cb500" containerName="registry-server" Feb 02 13:26:11 crc kubenswrapper[4721]: E0202 13:26:11.284040 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2db9546-309a-4e2a-8b57-e2986e6cb500" containerName="extract-content" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.284186 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2db9546-309a-4e2a-8b57-e2986e6cb500" containerName="extract-content" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.284576 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2db9546-309a-4e2a-8b57-e2986e6cb500" containerName="registry-server" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.285540 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-2c26-account-create-update-2r4tb" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.291152 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.304762 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-2c26-account-create-update-2r4tb"] Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.390520 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2db9546-309a-4e2a-8b57-e2986e6cb500-catalog-content\") pod \"c2db9546-309a-4e2a-8b57-e2986e6cb500\" (UID: \"c2db9546-309a-4e2a-8b57-e2986e6cb500\") " Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.390995 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2db9546-309a-4e2a-8b57-e2986e6cb500-utilities\") pod \"c2db9546-309a-4e2a-8b57-e2986e6cb500\" (UID: \"c2db9546-309a-4e2a-8b57-e2986e6cb500\") " Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.391203 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6ncw\" (UniqueName: \"kubernetes.io/projected/c2db9546-309a-4e2a-8b57-e2986e6cb500-kube-api-access-b6ncw\") pod \"c2db9546-309a-4e2a-8b57-e2986e6cb500\" (UID: \"c2db9546-309a-4e2a-8b57-e2986e6cb500\") " Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.391721 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/997707ef-4296-4151-9385-0fbb48b5e317-operator-scripts\") pod \"aodh-2c26-account-create-update-2r4tb\" (UID: \"997707ef-4296-4151-9385-0fbb48b5e317\") " pod="openstack/aodh-2c26-account-create-update-2r4tb" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.391916 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njwrd\" (UniqueName: \"kubernetes.io/projected/997707ef-4296-4151-9385-0fbb48b5e317-kube-api-access-njwrd\") pod \"aodh-2c26-account-create-update-2r4tb\" (UID: \"997707ef-4296-4151-9385-0fbb48b5e317\") " pod="openstack/aodh-2c26-account-create-update-2r4tb" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.392000 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcbwm\" (UniqueName: \"kubernetes.io/projected/ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9-kube-api-access-jcbwm\") pod \"aodh-db-create-gq5tv\" (UID: \"ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9\") " pod="openstack/aodh-db-create-gq5tv" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.392274 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9-operator-scripts\") pod \"aodh-db-create-gq5tv\" (UID: \"ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9\") " pod="openstack/aodh-db-create-gq5tv" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.393464 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2db9546-309a-4e2a-8b57-e2986e6cb500-utilities" (OuterVolumeSpecName: "utilities") pod "c2db9546-309a-4e2a-8b57-e2986e6cb500" (UID: "c2db9546-309a-4e2a-8b57-e2986e6cb500"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.399354 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2db9546-309a-4e2a-8b57-e2986e6cb500-kube-api-access-b6ncw" (OuterVolumeSpecName: "kube-api-access-b6ncw") pod "c2db9546-309a-4e2a-8b57-e2986e6cb500" (UID: "c2db9546-309a-4e2a-8b57-e2986e6cb500"). InnerVolumeSpecName "kube-api-access-b6ncw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.455207 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2db9546-309a-4e2a-8b57-e2986e6cb500-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2db9546-309a-4e2a-8b57-e2986e6cb500" (UID: "c2db9546-309a-4e2a-8b57-e2986e6cb500"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.459471 4721 generic.go:334] "Generic (PLEG): container finished" podID="c2db9546-309a-4e2a-8b57-e2986e6cb500" containerID="34f03665f21982d0db05900d73ae2a2969e1e103422293a6d050c2f0d4308917" exitCode=0 Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.459554 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vl65" event={"ID":"c2db9546-309a-4e2a-8b57-e2986e6cb500","Type":"ContainerDied","Data":"34f03665f21982d0db05900d73ae2a2969e1e103422293a6d050c2f0d4308917"} Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.459587 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vl65" event={"ID":"c2db9546-309a-4e2a-8b57-e2986e6cb500","Type":"ContainerDied","Data":"ae96317dbbd9659e948a12834719912af10f2e2c9b3f171a99e46548715f116d"} Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.459607 4721 scope.go:117] "RemoveContainer" containerID="34f03665f21982d0db05900d73ae2a2969e1e103422293a6d050c2f0d4308917" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.459772 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vl65" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.464465 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d1170efb-7d4f-4c8b-84dd-6d86925bf5d3","Type":"ContainerDied","Data":"17422626d1aef368e4a89ff57fcd151611e206eb400b70c99ece56e2a2b12dbd"} Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.464658 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.483722 4721 scope.go:117] "RemoveContainer" containerID="1d340a91edc1a2d06152b7bbe13ebe7e5ae745bad0fc295760f0755f827254e4" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.494252 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/997707ef-4296-4151-9385-0fbb48b5e317-operator-scripts\") pod \"aodh-2c26-account-create-update-2r4tb\" (UID: \"997707ef-4296-4151-9385-0fbb48b5e317\") " pod="openstack/aodh-2c26-account-create-update-2r4tb" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.494303 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njwrd\" (UniqueName: \"kubernetes.io/projected/997707ef-4296-4151-9385-0fbb48b5e317-kube-api-access-njwrd\") pod \"aodh-2c26-account-create-update-2r4tb\" (UID: \"997707ef-4296-4151-9385-0fbb48b5e317\") " pod="openstack/aodh-2c26-account-create-update-2r4tb" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.494339 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcbwm\" (UniqueName: \"kubernetes.io/projected/ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9-kube-api-access-jcbwm\") pod \"aodh-db-create-gq5tv\" (UID: \"ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9\") " pod="openstack/aodh-db-create-gq5tv" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.494536 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9-operator-scripts\") pod \"aodh-db-create-gq5tv\" (UID: \"ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9\") " pod="openstack/aodh-db-create-gq5tv" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.494632 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2db9546-309a-4e2a-8b57-e2986e6cb500-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.494645 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2db9546-309a-4e2a-8b57-e2986e6cb500-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.494669 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6ncw\" (UniqueName: \"kubernetes.io/projected/c2db9546-309a-4e2a-8b57-e2986e6cb500-kube-api-access-b6ncw\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.496755 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/997707ef-4296-4151-9385-0fbb48b5e317-operator-scripts\") pod \"aodh-2c26-account-create-update-2r4tb\" (UID: \"997707ef-4296-4151-9385-0fbb48b5e317\") " pod="openstack/aodh-2c26-account-create-update-2r4tb" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.497182 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9-operator-scripts\") pod \"aodh-db-create-gq5tv\" (UID: \"ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9\") " pod="openstack/aodh-db-create-gq5tv" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.523891 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jcbwm\" (UniqueName: \"kubernetes.io/projected/ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9-kube-api-access-jcbwm\") pod \"aodh-db-create-gq5tv\" (UID: \"ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9\") " pod="openstack/aodh-db-create-gq5tv" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.529773 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njwrd\" (UniqueName: \"kubernetes.io/projected/997707ef-4296-4151-9385-0fbb48b5e317-kube-api-access-njwrd\") pod \"aodh-2c26-account-create-update-2r4tb\" (UID: \"997707ef-4296-4151-9385-0fbb48b5e317\") " pod="openstack/aodh-2c26-account-create-update-2r4tb" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.546584 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.566324 4721 scope.go:117] "RemoveContainer" containerID="214137d2119b0f131d16421f0c5475602028f01ed014db4a693d69884c3bf169" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.567638 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.577825 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-gq5tv" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.604699 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.610302 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.611657 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-2c26-account-create-update-2r4tb" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.617825 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.618308 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.627552 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6vl65"] Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.631767 4721 scope.go:117] "RemoveContainer" containerID="34f03665f21982d0db05900d73ae2a2969e1e103422293a6d050c2f0d4308917" Feb 02 13:26:11 crc kubenswrapper[4721]: E0202 13:26:11.632274 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34f03665f21982d0db05900d73ae2a2969e1e103422293a6d050c2f0d4308917\": container with ID starting with 34f03665f21982d0db05900d73ae2a2969e1e103422293a6d050c2f0d4308917 not found: ID does not exist" containerID="34f03665f21982d0db05900d73ae2a2969e1e103422293a6d050c2f0d4308917" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.632324 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34f03665f21982d0db05900d73ae2a2969e1e103422293a6d050c2f0d4308917"} err="failed to get container status \"34f03665f21982d0db05900d73ae2a2969e1e103422293a6d050c2f0d4308917\": rpc error: code = NotFound desc = could not find container \"34f03665f21982d0db05900d73ae2a2969e1e103422293a6d050c2f0d4308917\": container with ID starting with 34f03665f21982d0db05900d73ae2a2969e1e103422293a6d050c2f0d4308917 not found: ID does not exist" 
Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.632380 4721 scope.go:117] "RemoveContainer" containerID="1d340a91edc1a2d06152b7bbe13ebe7e5ae745bad0fc295760f0755f827254e4" Feb 02 13:26:11 crc kubenswrapper[4721]: E0202 13:26:11.632789 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d340a91edc1a2d06152b7bbe13ebe7e5ae745bad0fc295760f0755f827254e4\": container with ID starting with 1d340a91edc1a2d06152b7bbe13ebe7e5ae745bad0fc295760f0755f827254e4 not found: ID does not exist" containerID="1d340a91edc1a2d06152b7bbe13ebe7e5ae745bad0fc295760f0755f827254e4" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.632831 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d340a91edc1a2d06152b7bbe13ebe7e5ae745bad0fc295760f0755f827254e4"} err="failed to get container status \"1d340a91edc1a2d06152b7bbe13ebe7e5ae745bad0fc295760f0755f827254e4\": rpc error: code = NotFound desc = could not find container \"1d340a91edc1a2d06152b7bbe13ebe7e5ae745bad0fc295760f0755f827254e4\": container with ID starting with 1d340a91edc1a2d06152b7bbe13ebe7e5ae745bad0fc295760f0755f827254e4 not found: ID does not exist" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.632888 4721 scope.go:117] "RemoveContainer" containerID="214137d2119b0f131d16421f0c5475602028f01ed014db4a693d69884c3bf169" Feb 02 13:26:11 crc kubenswrapper[4721]: E0202 13:26:11.635421 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"214137d2119b0f131d16421f0c5475602028f01ed014db4a693d69884c3bf169\": container with ID starting with 214137d2119b0f131d16421f0c5475602028f01ed014db4a693d69884c3bf169 not found: ID does not exist" containerID="214137d2119b0f131d16421f0c5475602028f01ed014db4a693d69884c3bf169" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.635494 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"214137d2119b0f131d16421f0c5475602028f01ed014db4a693d69884c3bf169"} err="failed to get container status \"214137d2119b0f131d16421f0c5475602028f01ed014db4a693d69884c3bf169\": rpc error: code = NotFound desc = could not find container \"214137d2119b0f131d16421f0c5475602028f01ed014db4a693d69884c3bf169\": container with ID starting with 214137d2119b0f131d16421f0c5475602028f01ed014db4a693d69884c3bf169 not found: ID does not exist" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.635545 4721 scope.go:117] "RemoveContainer" containerID="b559fdc4b6fca20276abf4dceba9979868b3f8f7cc76c9ba5d01cbaddb599222" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.643913 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6vl65"] Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.658193 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.687902 4721 scope.go:117] "RemoveContainer" containerID="3b0302d825319ad93ac4b7a0c298d8bf5cb6775771000ecd18cf0462db227389" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.715757 4721 scope.go:117] "RemoveContainer" containerID="b0b5571eb117a42c85d555cc9f73ebf04bc2152bc212ed2d51da96cbfa4c0649" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.773850 4721 scope.go:117] "RemoveContainer" containerID="250d61da9c9aec53fa8403200cdb7bf07e9fd015ad372194d53e43f3edd63adb" Feb 02 13:26:11 crc kubenswrapper[4721]: 
I0202 13:26:11.801946 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-scripts\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.802035 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-config-data\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.802082 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd964db7-c2d3-477b-be71-60058c811541-run-httpd\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.802127 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd964db7-c2d3-477b-be71-60058c811541-log-httpd\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.802165 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.802196 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlqf6\" (UniqueName: \"kubernetes.io/projected/dd964db7-c2d3-477b-be71-60058c811541-kube-api-access-rlqf6\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.802275 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.908025 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlqf6\" (UniqueName: \"kubernetes.io/projected/dd964db7-c2d3-477b-be71-60058c811541-kube-api-access-rlqf6\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.908250 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.908310 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-scripts\") pod 
\"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.908948 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-config-data\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.909099 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd964db7-c2d3-477b-be71-60058c811541-run-httpd\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.909154 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd964db7-c2d3-477b-be71-60058c811541-log-httpd\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.909198 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.915465 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.915885 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd964db7-c2d3-477b-be71-60058c811541-run-httpd\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.916225 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd964db7-c2d3-477b-be71-60058c811541-log-httpd\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.917175 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.921274 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-scripts\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.921971 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-config-data\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" 
Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.926777 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlqf6\" (UniqueName: \"kubernetes.io/projected/dd964db7-c2d3-477b-be71-60058c811541-kube-api-access-rlqf6\") pod \"ceilometer-0\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") " pod="openstack/ceilometer-0" Feb 02 13:26:11 crc kubenswrapper[4721]: I0202 13:26:11.935276 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:26:12 crc kubenswrapper[4721]: W0202 13:26:12.156315 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad02c2c3_e07d_4ab9_8498_26e3c2bfdfb9.slice/crio-db646f49ac24673585383b65fa497f105bd4cdfb471374da88118daa7739bca0 WatchSource:0}: Error finding container db646f49ac24673585383b65fa497f105bd4cdfb471374da88118daa7739bca0: Status 404 returned error can't find the container with id db646f49ac24673585383b65fa497f105bd4cdfb471374da88118daa7739bca0 Feb 02 13:26:12 crc kubenswrapper[4721]: I0202 13:26:12.158643 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-gq5tv"] Feb 02 13:26:12 crc kubenswrapper[4721]: I0202 13:26:12.322716 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-2c26-account-create-update-2r4tb"] Feb 02 13:26:12 crc kubenswrapper[4721]: I0202 13:26:12.433759 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bddd12fa-0653-4199-867f-bfdf51350b39" path="/var/lib/kubelet/pods/bddd12fa-0653-4199-867f-bfdf51350b39/volumes" Feb 02 13:26:12 crc kubenswrapper[4721]: I0202 13:26:12.436306 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2db9546-309a-4e2a-8b57-e2986e6cb500" path="/var/lib/kubelet/pods/c2db9546-309a-4e2a-8b57-e2986e6cb500/volumes" Feb 02 13:26:12 crc kubenswrapper[4721]: I0202 13:26:12.438198 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1170efb-7d4f-4c8b-84dd-6d86925bf5d3" path="/var/lib/kubelet/pods/d1170efb-7d4f-4c8b-84dd-6d86925bf5d3/volumes" Feb 02 13:26:12 crc kubenswrapper[4721]: I0202 13:26:12.504798 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-gq5tv" event={"ID":"ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9","Type":"ContainerStarted","Data":"2995f84a212135f6e3822d895963dca122d6d439797b7dcab3e6c3486dd7be70"} Feb 02 13:26:12 crc kubenswrapper[4721]: I0202 13:26:12.505041 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-gq5tv" event={"ID":"ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9","Type":"ContainerStarted","Data":"db646f49ac24673585383b65fa497f105bd4cdfb471374da88118daa7739bca0"} Feb 02 13:26:12 crc kubenswrapper[4721]: I0202 13:26:12.510126 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-2c26-account-create-update-2r4tb" event={"ID":"997707ef-4296-4151-9385-0fbb48b5e317","Type":"ContainerStarted","Data":"30f0205f517a6c8c56e5656abaa390af0c771c20a778135ec582e019bcd70c1e"} Feb 02 13:26:12 crc kubenswrapper[4721]: I0202 13:26:12.527048 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:26:12 crc kubenswrapper[4721]: I0202 13:26:12.541633 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-gq5tv" podStartSLOduration=1.541607474 podStartE2EDuration="1.541607474s" podCreationTimestamp="2026-02-02 13:26:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:26:12.526525126 +0000 UTC m=+1512.829039515" watchObservedRunningTime="2026-02-02 13:26:12.541607474 +0000 UTC m=+1512.844121873" Feb 02 13:26:13 crc kubenswrapper[4721]: I0202 13:26:13.528658 4721 generic.go:334] "Generic (PLEG): container finished" podID="ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9" containerID="2995f84a212135f6e3822d895963dca122d6d439797b7dcab3e6c3486dd7be70" exitCode=0 Feb 02 13:26:13 crc kubenswrapper[4721]: I0202 13:26:13.528702 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-gq5tv" event={"ID":"ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9","Type":"ContainerDied","Data":"2995f84a212135f6e3822d895963dca122d6d439797b7dcab3e6c3486dd7be70"} Feb 02 13:26:13 crc kubenswrapper[4721]: I0202 13:26:13.536838 4721 generic.go:334] "Generic (PLEG): container finished" podID="997707ef-4296-4151-9385-0fbb48b5e317" containerID="1857f80c5a621ba0a1377e0d92d2bafe33fbe9b8df8cc9bde107743c0bafe96d" exitCode=0 Feb 02 13:26:13 crc kubenswrapper[4721]: I0202 13:26:13.536950 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-2c26-account-create-update-2r4tb" event={"ID":"997707ef-4296-4151-9385-0fbb48b5e317","Type":"ContainerDied","Data":"1857f80c5a621ba0a1377e0d92d2bafe33fbe9b8df8cc9bde107743c0bafe96d"} Feb 02 13:26:13 crc kubenswrapper[4721]: I0202 13:26:13.540850 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd964db7-c2d3-477b-be71-60058c811541","Type":"ContainerStarted","Data":"941cea547a9c6139f4cd34b3cba1a9232469327b41a70627da4554d99e83c28b"} Feb 02 13:26:13 crc kubenswrapper[4721]: I0202 13:26:13.540900 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd964db7-c2d3-477b-be71-60058c811541","Type":"ContainerStarted","Data":"72f8e4545197a36201cfdd4ca1128b7b9319c774b1a3e5c0ba44e89faa242906"} Feb 02 13:26:14 crc kubenswrapper[4721]: I0202 13:26:14.556552 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd964db7-c2d3-477b-be71-60058c811541","Type":"ContainerStarted","Data":"3fc0efffdb16822dab9442b0b34ece997fbe26ebe5808aadd9adb66a693a7dd5"} Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.136841 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-2c26-account-create-update-2r4tb" Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.147245 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-gq5tv" Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.323464 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcbwm\" (UniqueName: \"kubernetes.io/projected/ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9-kube-api-access-jcbwm\") pod \"ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9\" (UID: \"ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9\") " Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.323538 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njwrd\" (UniqueName: \"kubernetes.io/projected/997707ef-4296-4151-9385-0fbb48b5e317-kube-api-access-njwrd\") pod \"997707ef-4296-4151-9385-0fbb48b5e317\" (UID: \"997707ef-4296-4151-9385-0fbb48b5e317\") " Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.323595 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9-operator-scripts\") pod \"ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9\" (UID: \"ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9\") " Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.323696 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/997707ef-4296-4151-9385-0fbb48b5e317-operator-scripts\") pod \"997707ef-4296-4151-9385-0fbb48b5e317\" (UID: \"997707ef-4296-4151-9385-0fbb48b5e317\") " Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.324527 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/997707ef-4296-4151-9385-0fbb48b5e317-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "997707ef-4296-4151-9385-0fbb48b5e317" (UID: "997707ef-4296-4151-9385-0fbb48b5e317"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.324549 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9" (UID: "ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.342059 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9-kube-api-access-jcbwm" (OuterVolumeSpecName: "kube-api-access-jcbwm") pod "ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9" (UID: "ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9"). InnerVolumeSpecName "kube-api-access-jcbwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.342135 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/997707ef-4296-4151-9385-0fbb48b5e317-kube-api-access-njwrd" (OuterVolumeSpecName: "kube-api-access-njwrd") pod "997707ef-4296-4151-9385-0fbb48b5e317" (UID: "997707ef-4296-4151-9385-0fbb48b5e317"). InnerVolumeSpecName "kube-api-access-njwrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.426915 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcbwm\" (UniqueName: \"kubernetes.io/projected/ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9-kube-api-access-jcbwm\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.426950 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njwrd\" (UniqueName: \"kubernetes.io/projected/997707ef-4296-4151-9385-0fbb48b5e317-kube-api-access-njwrd\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.426962 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.426972 4721 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/997707ef-4296-4151-9385-0fbb48b5e317-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.570330 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd964db7-c2d3-477b-be71-60058c811541","Type":"ContainerStarted","Data":"2e794fb911c91ba362224447e9ecd49d098fbbb2f1fb15a45aedc118537561bc"} Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.572380 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-gq5tv" event={"ID":"ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9","Type":"ContainerDied","Data":"db646f49ac24673585383b65fa497f105bd4cdfb471374da88118daa7739bca0"} Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.572412 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db646f49ac24673585383b65fa497f105bd4cdfb471374da88118daa7739bca0" Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.572457 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-gq5tv" Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.574680 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-2c26-account-create-update-2r4tb" event={"ID":"997707ef-4296-4151-9385-0fbb48b5e317","Type":"ContainerDied","Data":"30f0205f517a6c8c56e5656abaa390af0c771c20a778135ec582e019bcd70c1e"} Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.574724 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30f0205f517a6c8c56e5656abaa390af0c771c20a778135ec582e019bcd70c1e" Feb 02 13:26:15 crc kubenswrapper[4721]: I0202 13:26:15.574749 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-2c26-account-create-update-2r4tb" Feb 02 13:26:16 crc kubenswrapper[4721]: I0202 13:26:16.851424 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.611828 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd964db7-c2d3-477b-be71-60058c811541","Type":"ContainerStarted","Data":"4b856042309c84ea0a6e8c400ccf627a4542f6766de95577e1154ef8996d41d0"} Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.612216 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.642704 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.487876929 podStartE2EDuration="6.642673681s" podCreationTimestamp="2026-02-02 13:26:11 +0000 UTC" firstStartedPulling="2026-02-02 13:26:12.529512497 +0000 UTC m=+1512.832026886" lastFinishedPulling="2026-02-02 13:26:16.684309249 +0000 UTC m=+1516.986823638" observedRunningTime="2026-02-02 13:26:17.635704292 +0000 UTC m=+1517.938218701" watchObservedRunningTime="2026-02-02 13:26:17.642673681 +0000 UTC m=+1517.945188070" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.744915 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-4rm59"] Feb 02 13:26:17 crc kubenswrapper[4721]: E0202 13:26:17.745798 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9" containerName="mariadb-database-create" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.745818 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9" containerName="mariadb-database-create" Feb 02 13:26:17 crc kubenswrapper[4721]: E0202 13:26:17.745830 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="997707ef-4296-4151-9385-0fbb48b5e317" containerName="mariadb-account-create-update" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.745837 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="997707ef-4296-4151-9385-0fbb48b5e317" containerName="mariadb-account-create-update" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.746136 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9" containerName="mariadb-database-create" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.746165 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="997707ef-4296-4151-9385-0fbb48b5e317" containerName="mariadb-account-create-update" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.747080 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.750768 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.752231 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.767877 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4rm59"] Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.889956 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4rm59\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.890033 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlbf6\" (UniqueName: \"kubernetes.io/projected/798dac79-94bd-4655-b409-4b173956cdbf-kube-api-access-mlbf6\") pod \"nova-cell0-cell-mapping-4rm59\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.890151 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-config-data\") pod \"nova-cell0-cell-mapping-4rm59\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.890269 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-scripts\") pod \"nova-cell0-cell-mapping-4rm59\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.959354 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.960841 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.965728 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.995048 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-scripts\") pod \"nova-cell0-cell-mapping-4rm59\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.995191 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4rm59\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.995242 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlbf6\" (UniqueName: \"kubernetes.io/projected/798dac79-94bd-4655-b409-4b173956cdbf-kube-api-access-mlbf6\") pod \"nova-cell0-cell-mapping-4rm59\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:17 crc kubenswrapper[4721]: I0202 13:26:17.995308 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-config-data\") pod \"nova-cell0-cell-mapping-4rm59\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.005209 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-config-data\") pod \"nova-cell0-cell-mapping-4rm59\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.010720 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-scripts\") pod \"nova-cell0-cell-mapping-4rm59\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.023275 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.027837 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4rm59\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.042265 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlbf6\" (UniqueName: \"kubernetes.io/projected/798dac79-94bd-4655-b409-4b173956cdbf-kube-api-access-mlbf6\") pod \"nova-cell0-cell-mapping-4rm59\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.042414 4721 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.059940 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.068992 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.075780 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.079729 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.097742 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.097958 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh8l7\" (UniqueName: \"kubernetes.io/projected/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-kube-api-access-nh8l7\") pod \"nova-cell1-novncproxy-0\" (UID: \"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.098208 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.117660 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.119301 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.131591 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.140316 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.188765 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.191145 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.199472 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.200819 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16c1a06d-626e-414f-9fa0-a09e68349ffa-logs\") pod \"nova-metadata-0\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " pod="openstack/nova-metadata-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.200887 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.200913 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16c1a06d-626e-414f-9fa0-a09e68349ffa-config-data\") pod \"nova-metadata-0\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " pod="openstack/nova-metadata-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.200952 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16c1a06d-626e-414f-9fa0-a09e68349ffa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " pod="openstack/nova-metadata-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.201000 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5qfc\" (UniqueName: \"kubernetes.io/projected/16c1a06d-626e-414f-9fa0-a09e68349ffa-kube-api-access-d5qfc\") pod \"nova-metadata-0\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " pod="openstack/nova-metadata-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.201115 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh8l7\" (UniqueName: \"kubernetes.io/projected/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-kube-api-access-nh8l7\") pod \"nova-cell1-novncproxy-0\" (UID: \"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.201201 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.208897 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.212641 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.245437 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh8l7\" (UniqueName: \"kubernetes.io/projected/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-kube-api-access-nh8l7\") pod \"nova-cell1-novncproxy-0\" (UID: \"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.248457 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-pclnt"] Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.251053 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.265150 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.289907 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.305604 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6deee91e-3b9b-46a0-a05e-613827b42808-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6deee91e-3b9b-46a0-a05e-613827b42808\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.305673 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpktt\" (UniqueName: \"kubernetes.io/projected/8715f5a7-97a1-496d-be28-13c326b54135-kube-api-access-wpktt\") pod \"nova-api-0\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " pod="openstack/nova-api-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.305742 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4h5x\" (UniqueName: \"kubernetes.io/projected/6deee91e-3b9b-46a0-a05e-613827b42808-kube-api-access-m4h5x\") pod \"nova-scheduler-0\" (UID: \"6deee91e-3b9b-46a0-a05e-613827b42808\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.305799 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8715f5a7-97a1-496d-be28-13c326b54135-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " pod="openstack/nova-api-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.305853 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8715f5a7-97a1-496d-be28-13c326b54135-logs\") pod \"nova-api-0\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " pod="openstack/nova-api-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.305946 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16c1a06d-626e-414f-9fa0-a09e68349ffa-logs\") pod \"nova-metadata-0\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " pod="openstack/nova-metadata-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.306001 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/16c1a06d-626e-414f-9fa0-a09e68349ffa-config-data\") pod \"nova-metadata-0\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " pod="openstack/nova-metadata-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.306058 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16c1a06d-626e-414f-9fa0-a09e68349ffa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " pod="openstack/nova-metadata-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.306573 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16c1a06d-626e-414f-9fa0-a09e68349ffa-logs\") pod \"nova-metadata-0\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " pod="openstack/nova-metadata-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.306692 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6deee91e-3b9b-46a0-a05e-613827b42808-config-data\") pod \"nova-scheduler-0\" (UID: \"6deee91e-3b9b-46a0-a05e-613827b42808\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.306755 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5qfc\" (UniqueName: \"kubernetes.io/projected/16c1a06d-626e-414f-9fa0-a09e68349ffa-kube-api-access-d5qfc\") pod \"nova-metadata-0\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " pod="openstack/nova-metadata-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.306785 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8715f5a7-97a1-496d-be28-13c326b54135-config-data\") pod \"nova-api-0\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " pod="openstack/nova-api-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.312141 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-pclnt"] Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.314295 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16c1a06d-626e-414f-9fa0-a09e68349ffa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " pod="openstack/nova-metadata-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.319790 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16c1a06d-626e-414f-9fa0-a09e68349ffa-config-data\") pod \"nova-metadata-0\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " pod="openstack/nova-metadata-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.337662 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5qfc\" (UniqueName: \"kubernetes.io/projected/16c1a06d-626e-414f-9fa0-a09e68349ffa-kube-api-access-d5qfc\") pod \"nova-metadata-0\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " pod="openstack/nova-metadata-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.343043 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.408632 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4h5x\" (UniqueName: \"kubernetes.io/projected/6deee91e-3b9b-46a0-a05e-613827b42808-kube-api-access-m4h5x\") pod \"nova-scheduler-0\" (UID: \"6deee91e-3b9b-46a0-a05e-613827b42808\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.408687 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clh4b\" (UniqueName: \"kubernetes.io/projected/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-kube-api-access-clh4b\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.408724 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-dns-svc\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.408774 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8715f5a7-97a1-496d-be28-13c326b54135-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " pod="openstack/nova-api-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.408814 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8715f5a7-97a1-496d-be28-13c326b54135-logs\") pod \"nova-api-0\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " pod="openstack/nova-api-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.408916 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-config\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.408956 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6deee91e-3b9b-46a0-a05e-613827b42808-config-data\") pod \"nova-scheduler-0\" (UID: \"6deee91e-3b9b-46a0-a05e-613827b42808\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.408993 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8715f5a7-97a1-496d-be28-13c326b54135-config-data\") pod \"nova-api-0\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " pod="openstack/nova-api-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.409013 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.409042 4721 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6deee91e-3b9b-46a0-a05e-613827b42808-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6deee91e-3b9b-46a0-a05e-613827b42808\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.409062 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.409114 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpktt\" (UniqueName: \"kubernetes.io/projected/8715f5a7-97a1-496d-be28-13c326b54135-kube-api-access-wpktt\") pod \"nova-api-0\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " pod="openstack/nova-api-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.409141 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.421538 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8715f5a7-97a1-496d-be28-13c326b54135-config-data\") pod \"nova-api-0\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " pod="openstack/nova-api-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.421879 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8715f5a7-97a1-496d-be28-13c326b54135-logs\") pod \"nova-api-0\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " pod="openstack/nova-api-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.427324 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8715f5a7-97a1-496d-be28-13c326b54135-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " pod="openstack/nova-api-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.433627 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6deee91e-3b9b-46a0-a05e-613827b42808-config-data\") pod \"nova-scheduler-0\" (UID: \"6deee91e-3b9b-46a0-a05e-613827b42808\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.434829 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6deee91e-3b9b-46a0-a05e-613827b42808-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6deee91e-3b9b-46a0-a05e-613827b42808\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.451784 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4h5x\" (UniqueName: \"kubernetes.io/projected/6deee91e-3b9b-46a0-a05e-613827b42808-kube-api-access-m4h5x\") pod \"nova-scheduler-0\" (UID: \"6deee91e-3b9b-46a0-a05e-613827b42808\") " pod="openstack/nova-scheduler-0" Feb 02 
13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.452157 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpktt\" (UniqueName: \"kubernetes.io/projected/8715f5a7-97a1-496d-be28-13c326b54135-kube-api-access-wpktt\") pod \"nova-api-0\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " pod="openstack/nova-api-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.513334 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-config\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.513866 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.513948 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.514001 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.514100 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clh4b\" (UniqueName: \"kubernetes.io/projected/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-kube-api-access-clh4b\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.514144 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-dns-svc\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.516315 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.516873 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.519891 4721 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-config\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.525867 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.534835 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-dns-svc\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.540816 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clh4b\" (UniqueName: \"kubernetes.io/projected/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-kube-api-access-clh4b\") pod \"dnsmasq-dns-9b86998b5-pclnt\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") " pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.679734 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.699913 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.735506 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:18 crc kubenswrapper[4721]: I0202 13:26:18.895387 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4rm59"] Feb 02 13:26:18 crc kubenswrapper[4721]: W0202 13:26:18.920576 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod798dac79_94bd_4655_b409_4b173956cdbf.slice/crio-f420768dcd70c2968aba8188e47922f2bb1ac5868d1594e6d772fbe971edf203 WatchSource:0}: Error finding container f420768dcd70c2968aba8188e47922f2bb1ac5868d1594e6d772fbe971edf203: Status 404 returned error can't find the container with id f420768dcd70c2968aba8188e47922f2bb1ac5868d1594e6d772fbe971edf203 Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.181901 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 13:26:19 crc kubenswrapper[4721]: W0202 13:26:19.185354 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16c1a06d_626e_414f_9fa0_a09e68349ffa.slice/crio-4979432985debe9a9103a70638cdc02af7e7ff0e27c39cb5362f33e117dad874 WatchSource:0}: Error finding container 4979432985debe9a9103a70638cdc02af7e7ff0e27c39cb5362f33e117dad874: Status 404 returned error can't find the container with id 4979432985debe9a9103a70638cdc02af7e7ff0e27c39cb5362f33e117dad874 Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.205848 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.262458 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hclll"] Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.267605 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.278361 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.278564 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.283936 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hclll"] Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.355116 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdcgd\" (UniqueName: \"kubernetes.io/projected/6d38541e-139a-425e-a7bd-f7c484f7266b-kube-api-access-fdcgd\") pod \"nova-cell1-conductor-db-sync-hclll\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.355801 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-config-data\") pod \"nova-cell1-conductor-db-sync-hclll\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.356061 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-scripts\") pod \"nova-cell1-conductor-db-sync-hclll\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.356926 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hclll\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:19 crc kubenswrapper[4721]: W0202 13:26:19.437568 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6deee91e_3b9b_46a0_a05e_613827b42808.slice/crio-6be24bc44a924556f23ce7d2ad20d369ad32da8a65f3765a7bf2193db6ff3cfc WatchSource:0}: Error finding container 6be24bc44a924556f23ce7d2ad20d369ad32da8a65f3765a7bf2193db6ff3cfc: Status 404 returned error can't find the container with id 6be24bc44a924556f23ce7d2ad20d369ad32da8a65f3765a7bf2193db6ff3cfc Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.440627 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.459458 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-scripts\") pod \"nova-cell1-conductor-db-sync-hclll\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.459710 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hclll\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.459868 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdcgd\" (UniqueName: \"kubernetes.io/projected/6d38541e-139a-425e-a7bd-f7c484f7266b-kube-api-access-fdcgd\") pod \"nova-cell1-conductor-db-sync-hclll\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.459970 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-config-data\") pod \"nova-cell1-conductor-db-sync-hclll\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.504566 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-scripts\") pod \"nova-cell1-conductor-db-sync-hclll\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.508748 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdcgd\" (UniqueName: \"kubernetes.io/projected/6d38541e-139a-425e-a7bd-f7c484f7266b-kube-api-access-fdcgd\") pod \"nova-cell1-conductor-db-sync-hclll\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.512554 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-config-data\") pod \"nova-cell1-conductor-db-sync-hclll\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.524760 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hclll\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.634658 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.659701 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4rm59" event={"ID":"798dac79-94bd-4655-b409-4b173956cdbf","Type":"ContainerStarted","Data":"986695a877ff6ee1eff46d199a1d9a03005ab3f7a3f83da3d9212653e0413f7c"} Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.659757 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4rm59" event={"ID":"798dac79-94bd-4655-b409-4b173956cdbf","Type":"ContainerStarted","Data":"f420768dcd70c2968aba8188e47922f2bb1ac5868d1594e6d772fbe971edf203"} Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.662642 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.666267 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16c1a06d-626e-414f-9fa0-a09e68349ffa","Type":"ContainerStarted","Data":"4979432985debe9a9103a70638cdc02af7e7ff0e27c39cb5362f33e117dad874"} Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.691657 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e","Type":"ContainerStarted","Data":"b40b44f42b58694b1e7abb93930ae837bfaf27b3b7a9cd3931ed69ef1a81d994"} Feb 02 13:26:19 crc kubenswrapper[4721]: W0202 13:26:19.696578 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb87e6edb_4947_41a9_b95c_5120f9b4dbdc.slice/crio-5099999a072997f647e2393c3c5cb7d2e07ed8a533074bc15ca483b0989cce5e WatchSource:0}: Error finding container 5099999a072997f647e2393c3c5cb7d2e07ed8a533074bc15ca483b0989cce5e: Status 404 returned error can't find the container with id 5099999a072997f647e2393c3c5cb7d2e07ed8a533074bc15ca483b0989cce5e Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.698780 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-pclnt"] Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.701849 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6deee91e-3b9b-46a0-a05e-613827b42808","Type":"ContainerStarted","Data":"6be24bc44a924556f23ce7d2ad20d369ad32da8a65f3765a7bf2193db6ff3cfc"} Feb 02 13:26:19 crc kubenswrapper[4721]: I0202 13:26:19.715885 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-4rm59" podStartSLOduration=2.7158637370000003 podStartE2EDuration="2.715863737s" podCreationTimestamp="2026-02-02 13:26:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:26:19.692985367 +0000 UTC m=+1519.995499756" watchObservedRunningTime="2026-02-02 13:26:19.715863737 +0000 UTC m=+1520.018378136" Feb 02 13:26:20 crc kubenswrapper[4721]: E0202 13:26:20.582666 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01b253a_c7c6_4c9e_a800_a1732ba06f37.slice/crio-5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6\": RecentStats: unable to find data in memory cache]" Feb 02 13:26:20 crc kubenswrapper[4721]: I0202 13:26:20.585494 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hclll"] Feb 02 13:26:20 crc kubenswrapper[4721]: W0202 13:26:20.614654 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d38541e_139a_425e_a7bd_f7c484f7266b.slice/crio-f16e0687e4c95545bad8ad3d8adc326046486898b2e5a9a157138600b93876bb WatchSource:0}: Error finding container f16e0687e4c95545bad8ad3d8adc326046486898b2e5a9a157138600b93876bb: Status 404 returned error can't find the container with id f16e0687e4c95545bad8ad3d8adc326046486898b2e5a9a157138600b93876bb Feb 02 13:26:20 crc kubenswrapper[4721]: I0202 13:26:20.724232 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"8715f5a7-97a1-496d-be28-13c326b54135","Type":"ContainerStarted","Data":"dda040b5b2dc6f664a770c8e322b217a952cd150043f7182ef1730acf2437842"} Feb 02 13:26:20 crc kubenswrapper[4721]: I0202 13:26:20.731827 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hclll" event={"ID":"6d38541e-139a-425e-a7bd-f7c484f7266b","Type":"ContainerStarted","Data":"f16e0687e4c95545bad8ad3d8adc326046486898b2e5a9a157138600b93876bb"} Feb 02 13:26:20 crc kubenswrapper[4721]: I0202 13:26:20.735346 4721 generic.go:334] "Generic (PLEG): container finished" podID="b87e6edb-4947-41a9-b95c-5120f9b4dbdc" containerID="4daf317fd25f62ffeb9c6c1f2bdb1a7d756e3cbacfa5fcec5e77dfc48d7f394c" exitCode=0 Feb 02 13:26:20 crc kubenswrapper[4721]: I0202 13:26:20.736558 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-pclnt" event={"ID":"b87e6edb-4947-41a9-b95c-5120f9b4dbdc","Type":"ContainerDied","Data":"4daf317fd25f62ffeb9c6c1f2bdb1a7d756e3cbacfa5fcec5e77dfc48d7f394c"} Feb 02 13:26:20 crc kubenswrapper[4721]: I0202 13:26:20.736587 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-pclnt" event={"ID":"b87e6edb-4947-41a9-b95c-5120f9b4dbdc","Type":"ContainerStarted","Data":"5099999a072997f647e2393c3c5cb7d2e07ed8a533074bc15ca483b0989cce5e"} Feb 02 13:26:21 crc kubenswrapper[4721]: I0202 13:26:21.762048 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hclll" event={"ID":"6d38541e-139a-425e-a7bd-f7c484f7266b","Type":"ContainerStarted","Data":"7eef54336f3dd0e3ab875f1a48960393b687a8f714292e0019c4853d4094a4be"} Feb 02 13:26:21 crc kubenswrapper[4721]: I0202 13:26:21.780724 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-pclnt" event={"ID":"b87e6edb-4947-41a9-b95c-5120f9b4dbdc","Type":"ContainerStarted","Data":"513383a4b8dba8040ab714a52c634492b679dbd50ddd5850c311449d4d0f662d"} Feb 02 13:26:21 crc kubenswrapper[4721]: I0202 13:26:21.789943 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:21 crc kubenswrapper[4721]: I0202 13:26:21.790257 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-hclll" podStartSLOduration=2.790234957 podStartE2EDuration="2.790234957s" podCreationTimestamp="2026-02-02 13:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:26:21.788107878 +0000 UTC m=+1522.090622287" watchObservedRunningTime="2026-02-02 13:26:21.790234957 +0000 UTC m=+1522.092749356" Feb 02 13:26:21 crc kubenswrapper[4721]: I0202 13:26:21.822488 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-pclnt" podStartSLOduration=3.822468641 podStartE2EDuration="3.822468641s" podCreationTimestamp="2026-02-02 13:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:26:21.814060113 +0000 UTC m=+1522.116574502" watchObservedRunningTime="2026-02-02 13:26:21.822468641 +0000 UTC m=+1522.124983030" Feb 02 13:26:21 crc kubenswrapper[4721]: I0202 13:26:21.983754 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 13:26:22 crc kubenswrapper[4721]: I0202 13:26:22.001864 4721 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:26:24 crc kubenswrapper[4721]: I0202 13:26:24.830467 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e","Type":"ContainerStarted","Data":"39bda0040f1cfe72cdc0ea0bc846c40e1727cedfc5ccf95841b81f4bbd9ed35a"} Feb 02 13:26:24 crc kubenswrapper[4721]: I0202 13:26:24.830965 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://39bda0040f1cfe72cdc0ea0bc846c40e1727cedfc5ccf95841b81f4bbd9ed35a" gracePeriod=30 Feb 02 13:26:24 crc kubenswrapper[4721]: I0202 13:26:24.858798 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6deee91e-3b9b-46a0-a05e-613827b42808","Type":"ContainerStarted","Data":"3d9337e6ad6b91226e1d6dffcacca3ebd66fbbc0402ee3e8f5821c0fb854a95c"} Feb 02 13:26:24 crc kubenswrapper[4721]: I0202 13:26:24.876127 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8715f5a7-97a1-496d-be28-13c326b54135","Type":"ContainerStarted","Data":"e678d0355031514e26c0e876e17ad1d6766f6e13bf506ad8e8a465663e92ced1"} Feb 02 13:26:24 crc kubenswrapper[4721]: I0202 13:26:24.880465 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16c1a06d-626e-414f-9fa0-a09e68349ffa","Type":"ContainerStarted","Data":"d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449"} Feb 02 13:26:24 crc kubenswrapper[4721]: I0202 13:26:24.893250 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.758532675 podStartE2EDuration="7.893232543s" podCreationTimestamp="2026-02-02 13:26:17 +0000 UTC" firstStartedPulling="2026-02-02 13:26:19.22370166 +0000 UTC m=+1519.526216049" lastFinishedPulling="2026-02-02 13:26:24.358401528 +0000 UTC m=+1524.660915917" observedRunningTime="2026-02-02 13:26:24.85036075 +0000 UTC m=+1525.152875139" watchObservedRunningTime="2026-02-02 13:26:24.893232543 +0000 UTC m=+1525.195746932" Feb 02 13:26:24 crc kubenswrapper[4721]: I0202 13:26:24.897266 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.006580082 podStartE2EDuration="7.897250852s" podCreationTimestamp="2026-02-02 13:26:17 +0000 UTC" firstStartedPulling="2026-02-02 13:26:19.467743098 +0000 UTC m=+1519.770257487" lastFinishedPulling="2026-02-02 13:26:24.358413868 +0000 UTC m=+1524.660928257" observedRunningTime="2026-02-02 13:26:24.875872932 +0000 UTC m=+1525.178387331" watchObservedRunningTime="2026-02-02 13:26:24.897250852 +0000 UTC m=+1525.199765241" Feb 02 13:26:25 crc kubenswrapper[4721]: E0202 13:26:25.054849 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01b253a_c7c6_4c9e_a800_a1732ba06f37.slice/crio-5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6\": RecentStats: unable to find data in memory cache]" Feb 02 13:26:25 crc kubenswrapper[4721]: I0202 13:26:25.893972 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"8715f5a7-97a1-496d-be28-13c326b54135","Type":"ContainerStarted","Data":"187aece21bb2425b5bf5ce341d27bac916d46fddbd34aeac0734b4f84ad64e63"} Feb 02 13:26:25 crc kubenswrapper[4721]: I0202 13:26:25.895899 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16c1a06d-626e-414f-9fa0-a09e68349ffa","Type":"ContainerStarted","Data":"26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f"} Feb 02 13:26:25 crc kubenswrapper[4721]: I0202 13:26:25.896134 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="16c1a06d-626e-414f-9fa0-a09e68349ffa" containerName="nova-metadata-log" containerID="cri-o://d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449" gracePeriod=30 Feb 02 13:26:25 crc kubenswrapper[4721]: I0202 13:26:25.896168 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="16c1a06d-626e-414f-9fa0-a09e68349ffa" containerName="nova-metadata-metadata" containerID="cri-o://26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f" gracePeriod=30 Feb 02 13:26:25 crc kubenswrapper[4721]: I0202 13:26:25.921726 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.22140097 podStartE2EDuration="7.921706396s" podCreationTimestamp="2026-02-02 13:26:18 +0000 UTC" firstStartedPulling="2026-02-02 13:26:19.656692423 +0000 UTC m=+1519.959206812" lastFinishedPulling="2026-02-02 13:26:24.356997849 +0000 UTC m=+1524.659512238" observedRunningTime="2026-02-02 13:26:25.91337204 +0000 UTC m=+1526.215886429" watchObservedRunningTime="2026-02-02 13:26:25.921706396 +0000 UTC m=+1526.224220775" Feb 02 13:26:25 crc kubenswrapper[4721]: I0202 13:26:25.949243 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.787586723 podStartE2EDuration="8.949224692s" podCreationTimestamp="2026-02-02 13:26:17 +0000 UTC" firstStartedPulling="2026-02-02 13:26:19.217013288 +0000 UTC m=+1519.519527677" lastFinishedPulling="2026-02-02 13:26:24.378651237 +0000 UTC m=+1524.681165646" observedRunningTime="2026-02-02 13:26:25.937789652 +0000 UTC m=+1526.240304041" watchObservedRunningTime="2026-02-02 13:26:25.949224692 +0000 UTC m=+1526.251739081" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.572027 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.605706 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16c1a06d-626e-414f-9fa0-a09e68349ffa-logs\") pod \"16c1a06d-626e-414f-9fa0-a09e68349ffa\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.605753 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5qfc\" (UniqueName: \"kubernetes.io/projected/16c1a06d-626e-414f-9fa0-a09e68349ffa-kube-api-access-d5qfc\") pod \"16c1a06d-626e-414f-9fa0-a09e68349ffa\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.605808 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16c1a06d-626e-414f-9fa0-a09e68349ffa-combined-ca-bundle\") pod \"16c1a06d-626e-414f-9fa0-a09e68349ffa\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.605876 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16c1a06d-626e-414f-9fa0-a09e68349ffa-config-data\") pod \"16c1a06d-626e-414f-9fa0-a09e68349ffa\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.607660 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c1a06d-626e-414f-9fa0-a09e68349ffa-logs" (OuterVolumeSpecName: "logs") pod "16c1a06d-626e-414f-9fa0-a09e68349ffa" (UID: "16c1a06d-626e-414f-9fa0-a09e68349ffa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.616960 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16c1a06d-626e-414f-9fa0-a09e68349ffa-kube-api-access-d5qfc" (OuterVolumeSpecName: "kube-api-access-d5qfc") pod "16c1a06d-626e-414f-9fa0-a09e68349ffa" (UID: "16c1a06d-626e-414f-9fa0-a09e68349ffa"). InnerVolumeSpecName "kube-api-access-d5qfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:26:26 crc kubenswrapper[4721]: E0202 13:26:26.644362 4721 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16c1a06d-626e-414f-9fa0-a09e68349ffa-config-data podName:16c1a06d-626e-414f-9fa0-a09e68349ffa nodeName:}" failed. No retries permitted until 2026-02-02 13:26:27.144334045 +0000 UTC m=+1527.446848434 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/16c1a06d-626e-414f-9fa0-a09e68349ffa-config-data") pod "16c1a06d-626e-414f-9fa0-a09e68349ffa" (UID: "16c1a06d-626e-414f-9fa0-a09e68349ffa") : error deleting /var/lib/kubelet/pods/16c1a06d-626e-414f-9fa0-a09e68349ffa/volume-subpaths: remove /var/lib/kubelet/pods/16c1a06d-626e-414f-9fa0-a09e68349ffa/volume-subpaths: no such file or directory Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.647365 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16c1a06d-626e-414f-9fa0-a09e68349ffa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16c1a06d-626e-414f-9fa0-a09e68349ffa" (UID: "16c1a06d-626e-414f-9fa0-a09e68349ffa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.709608 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16c1a06d-626e-414f-9fa0-a09e68349ffa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.709652 4721 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16c1a06d-626e-414f-9fa0-a09e68349ffa-logs\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.709671 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5qfc\" (UniqueName: \"kubernetes.io/projected/16c1a06d-626e-414f-9fa0-a09e68349ffa-kube-api-access-d5qfc\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.910092 4721 generic.go:334] "Generic (PLEG): container finished" podID="16c1a06d-626e-414f-9fa0-a09e68349ffa" containerID="26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f" exitCode=0 Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.910127 4721 generic.go:334] "Generic (PLEG): container finished" podID="16c1a06d-626e-414f-9fa0-a09e68349ffa" containerID="d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449" exitCode=143 Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.910177 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.910807 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16c1a06d-626e-414f-9fa0-a09e68349ffa","Type":"ContainerDied","Data":"26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f"} Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.910860 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16c1a06d-626e-414f-9fa0-a09e68349ffa","Type":"ContainerDied","Data":"d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449"} Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.910874 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"16c1a06d-626e-414f-9fa0-a09e68349ffa","Type":"ContainerDied","Data":"4979432985debe9a9103a70638cdc02af7e7ff0e27c39cb5362f33e117dad874"} Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.910892 4721 scope.go:117] "RemoveContainer" containerID="26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.952790 4721 scope.go:117] "RemoveContainer" containerID="d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.982013 4721 scope.go:117] "RemoveContainer" containerID="26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f" Feb 02 13:26:26 crc kubenswrapper[4721]: E0202 13:26:26.982641 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f\": container with ID starting with 26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f not found: ID does not exist" containerID="26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.982676 4721 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f"} err="failed to get container status \"26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f\": rpc error: code = NotFound desc = could not find container \"26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f\": container with ID starting with 26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f not found: ID does not exist" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.982703 4721 scope.go:117] "RemoveContainer" containerID="d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449" Feb 02 13:26:26 crc kubenswrapper[4721]: E0202 13:26:26.983147 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449\": container with ID starting with d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449 not found: ID does not exist" containerID="d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.983169 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449"} err="failed to get container status \"d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449\": rpc error: code = NotFound desc = could not find container \"d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449\": container with ID starting with d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449 not found: ID does not exist" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.983181 4721 scope.go:117] "RemoveContainer" containerID="26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.983556 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f"} err="failed to get container status \"26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f\": rpc error: code = NotFound desc = could not find container \"26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f\": container with ID starting with 26bfcaea73d1b0a0e727ff022ecb290ea5ebc0e49635be31a814dbc20238e47f not found: ID does not exist" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.983574 4721 scope.go:117] "RemoveContainer" containerID="d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449" Feb 02 13:26:26 crc kubenswrapper[4721]: I0202 13:26:26.983862 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449"} err="failed to get container status \"d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449\": rpc error: code = NotFound desc = could not find container \"d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449\": container with ID starting with d2a55d0b43fdef668c43d93ef586abd1a401dd6f4301e561f85ee5cbf104d449 not found: ID does not exist" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.219201 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16c1a06d-626e-414f-9fa0-a09e68349ffa-config-data\") pod 
\"16c1a06d-626e-414f-9fa0-a09e68349ffa\" (UID: \"16c1a06d-626e-414f-9fa0-a09e68349ffa\") " Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.223390 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16c1a06d-626e-414f-9fa0-a09e68349ffa-config-data" (OuterVolumeSpecName: "config-data") pod "16c1a06d-626e-414f-9fa0-a09e68349ffa" (UID: "16c1a06d-626e-414f-9fa0-a09e68349ffa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.337327 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16c1a06d-626e-414f-9fa0-a09e68349ffa-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.554346 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.571595 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.589362 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:26:27 crc kubenswrapper[4721]: E0202 13:26:27.589893 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c1a06d-626e-414f-9fa0-a09e68349ffa" containerName="nova-metadata-log" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.589912 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c1a06d-626e-414f-9fa0-a09e68349ffa" containerName="nova-metadata-log" Feb 02 13:26:27 crc kubenswrapper[4721]: E0202 13:26:27.589956 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c1a06d-626e-414f-9fa0-a09e68349ffa" containerName="nova-metadata-metadata" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.589963 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c1a06d-626e-414f-9fa0-a09e68349ffa" containerName="nova-metadata-metadata" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.590191 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="16c1a06d-626e-414f-9fa0-a09e68349ffa" containerName="nova-metadata-metadata" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.590212 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="16c1a06d-626e-414f-9fa0-a09e68349ffa" containerName="nova-metadata-log" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.591454 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.594695 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.597712 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.646837 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.649970 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.650193 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-logs\") pod \"nova-metadata-0\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.650302 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-config-data\") pod \"nova-metadata-0\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.650499 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhxlz\" (UniqueName: \"kubernetes.io/projected/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-kube-api-access-vhxlz\") pod \"nova-metadata-0\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.650677 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.752021 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhxlz\" (UniqueName: \"kubernetes.io/projected/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-kube-api-access-vhxlz\") pod \"nova-metadata-0\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.752151 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.752249 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.752286 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-logs\") pod \"nova-metadata-0\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.752322 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-config-data\") pod \"nova-metadata-0\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.753644 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-logs\") pod \"nova-metadata-0\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.758091 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.758695 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-config-data\") pod \"nova-metadata-0\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.759091 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.773271 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhxlz\" (UniqueName: \"kubernetes.io/projected/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-kube-api-access-vhxlz\") pod \"nova-metadata-0\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " pod="openstack/nova-metadata-0" Feb 02 13:26:27 crc kubenswrapper[4721]: I0202 13:26:27.924312 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 13:26:28 crc kubenswrapper[4721]: I0202 13:26:28.291126 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:28 crc kubenswrapper[4721]: I0202 13:26:28.432714 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16c1a06d-626e-414f-9fa0-a09e68349ffa" path="/var/lib/kubelet/pods/16c1a06d-626e-414f-9fa0-a09e68349ffa/volumes" Feb 02 13:26:28 crc kubenswrapper[4721]: I0202 13:26:28.434098 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:26:28 crc kubenswrapper[4721]: W0202 13:26:28.442143 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07ae1ad9_237c_4b8f_acdf_ba750cc6316e.slice/crio-401c3ded565d56700cb511af03efc19368dafbf26ff256c1b27d75fad4eec09d WatchSource:0}: Error finding container 401c3ded565d56700cb511af03efc19368dafbf26ff256c1b27d75fad4eec09d: Status 404 returned error can't find the container with id 401c3ded565d56700cb511af03efc19368dafbf26ff256c1b27d75fad4eec09d Feb 02 13:26:28 crc kubenswrapper[4721]: I0202 13:26:28.681344 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 02 13:26:28 crc kubenswrapper[4721]: I0202 13:26:28.681421 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 02 13:26:28 crc kubenswrapper[4721]: I0202 13:26:28.700607 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 13:26:28 crc kubenswrapper[4721]: I0202 13:26:28.700663 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 13:26:28 crc kubenswrapper[4721]: I0202 13:26:28.716605 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 02 13:26:28 crc kubenswrapper[4721]: I0202 13:26:28.742291 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-pclnt" Feb 02 13:26:28 crc kubenswrapper[4721]: I0202 13:26:28.844544 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-m87tn"] Feb 02 13:26:28 crc kubenswrapper[4721]: I0202 13:26:28.845150 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" podUID="a217ca40-3638-474b-b739-cb8784823fa6" containerName="dnsmasq-dns" containerID="cri-o://02eb0b572a9d7db26d91ed79b8f1f7e0e781ae5e0ac90b394b51265b1c190088" gracePeriod=10 Feb 02 13:26:28 crc kubenswrapper[4721]: I0202 13:26:28.943668 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07ae1ad9-237c-4b8f-acdf-ba750cc6316e","Type":"ContainerStarted","Data":"7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d"} Feb 02 13:26:28 crc kubenswrapper[4721]: I0202 13:26:28.945234 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07ae1ad9-237c-4b8f-acdf-ba750cc6316e","Type":"ContainerStarted","Data":"401c3ded565d56700cb511af03efc19368dafbf26ff256c1b27d75fad4eec09d"} Feb 02 13:26:28 crc kubenswrapper[4721]: I0202 13:26:28.947193 4721 generic.go:334] "Generic (PLEG): container finished" podID="798dac79-94bd-4655-b409-4b173956cdbf" 
containerID="986695a877ff6ee1eff46d199a1d9a03005ab3f7a3f83da3d9212653e0413f7c" exitCode=0 Feb 02 13:26:28 crc kubenswrapper[4721]: I0202 13:26:28.947429 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4rm59" event={"ID":"798dac79-94bd-4655-b409-4b173956cdbf","Type":"ContainerDied","Data":"986695a877ff6ee1eff46d199a1d9a03005ab3f7a3f83da3d9212653e0413f7c"} Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.011746 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.553988 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.599242 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-config\") pod \"a217ca40-3638-474b-b739-cb8784823fa6\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.599348 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-ovsdbserver-sb\") pod \"a217ca40-3638-474b-b739-cb8784823fa6\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.599512 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crj69\" (UniqueName: \"kubernetes.io/projected/a217ca40-3638-474b-b739-cb8784823fa6-kube-api-access-crj69\") pod \"a217ca40-3638-474b-b739-cb8784823fa6\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.600106 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-ovsdbserver-nb\") pod \"a217ca40-3638-474b-b739-cb8784823fa6\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.600167 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-dns-swift-storage-0\") pod \"a217ca40-3638-474b-b739-cb8784823fa6\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.600528 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-dns-svc\") pod \"a217ca40-3638-474b-b739-cb8784823fa6\" (UID: \"a217ca40-3638-474b-b739-cb8784823fa6\") " Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.616807 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a217ca40-3638-474b-b739-cb8784823fa6-kube-api-access-crj69" (OuterVolumeSpecName: "kube-api-access-crj69") pod "a217ca40-3638-474b-b739-cb8784823fa6" (UID: "a217ca40-3638-474b-b739-cb8784823fa6"). InnerVolumeSpecName "kube-api-access-crj69". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.684976 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-config" (OuterVolumeSpecName: "config") pod "a217ca40-3638-474b-b739-cb8784823fa6" (UID: "a217ca40-3638-474b-b739-cb8784823fa6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.701807 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a217ca40-3638-474b-b739-cb8784823fa6" (UID: "a217ca40-3638-474b-b739-cb8784823fa6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.702105 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a217ca40-3638-474b-b739-cb8784823fa6" (UID: "a217ca40-3638-474b-b739-cb8784823fa6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.703977 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.704011 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.704026 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.704038 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crj69\" (UniqueName: \"kubernetes.io/projected/a217ca40-3638-474b-b739-cb8784823fa6-kube-api-access-crj69\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.728225 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a217ca40-3638-474b-b739-cb8784823fa6" (UID: "a217ca40-3638-474b-b739-cb8784823fa6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.749201 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a217ca40-3638-474b-b739-cb8784823fa6" (UID: "a217ca40-3638-474b-b739-cb8784823fa6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.783344 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8715f5a7-97a1-496d-be28-13c326b54135" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.249:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.783682 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8715f5a7-97a1-496d-be28-13c326b54135" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.249:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.818428 4721 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.818467 4721 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a217ca40-3638-474b-b739-cb8784823fa6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.967343 4721 generic.go:334] "Generic (PLEG): container finished" podID="6d38541e-139a-425e-a7bd-f7c484f7266b" containerID="7eef54336f3dd0e3ab875f1a48960393b687a8f714292e0019c4853d4094a4be" exitCode=0 Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.967408 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hclll" event={"ID":"6d38541e-139a-425e-a7bd-f7c484f7266b","Type":"ContainerDied","Data":"7eef54336f3dd0e3ab875f1a48960393b687a8f714292e0019c4853d4094a4be"} Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.971542 4721 generic.go:334] "Generic (PLEG): container finished" podID="a217ca40-3638-474b-b739-cb8784823fa6" containerID="02eb0b572a9d7db26d91ed79b8f1f7e0e781ae5e0ac90b394b51265b1c190088" exitCode=0 Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.971620 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" event={"ID":"a217ca40-3638-474b-b739-cb8784823fa6","Type":"ContainerDied","Data":"02eb0b572a9d7db26d91ed79b8f1f7e0e781ae5e0ac90b394b51265b1c190088"} Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.971649 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" event={"ID":"a217ca40-3638-474b-b739-cb8784823fa6","Type":"ContainerDied","Data":"855aab218dced6a9a20cd36dee1f3e920c647b6da54cc409503c35b4f9458f8e"} Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.971646 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.971668 4721 scope.go:117] "RemoveContainer" containerID="02eb0b572a9d7db26d91ed79b8f1f7e0e781ae5e0ac90b394b51265b1c190088" Feb 02 13:26:29 crc kubenswrapper[4721]: I0202 13:26:29.980651 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07ae1ad9-237c-4b8f-acdf-ba750cc6316e","Type":"ContainerStarted","Data":"0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4"} Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.011521 4721 scope.go:117] "RemoveContainer" containerID="bfd05ac10c25cef470159320ac73442924479047b666e7ec3809106d9b26e0ed" Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.027322 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.027299023 podStartE2EDuration="3.027299023s" podCreationTimestamp="2026-02-02 13:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:26:30.018482864 +0000 UTC m=+1530.320997253" watchObservedRunningTime="2026-02-02 13:26:30.027299023 +0000 UTC m=+1530.329813412" Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.056222 4721 scope.go:117] "RemoveContainer" containerID="02eb0b572a9d7db26d91ed79b8f1f7e0e781ae5e0ac90b394b51265b1c190088" Feb 02 13:26:30 crc kubenswrapper[4721]: E0202 13:26:30.060211 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02eb0b572a9d7db26d91ed79b8f1f7e0e781ae5e0ac90b394b51265b1c190088\": container with ID starting with 02eb0b572a9d7db26d91ed79b8f1f7e0e781ae5e0ac90b394b51265b1c190088 not found: ID does not exist" containerID="02eb0b572a9d7db26d91ed79b8f1f7e0e781ae5e0ac90b394b51265b1c190088" Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.060263 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02eb0b572a9d7db26d91ed79b8f1f7e0e781ae5e0ac90b394b51265b1c190088"} err="failed to get container status \"02eb0b572a9d7db26d91ed79b8f1f7e0e781ae5e0ac90b394b51265b1c190088\": rpc error: code = NotFound desc = could not find container \"02eb0b572a9d7db26d91ed79b8f1f7e0e781ae5e0ac90b394b51265b1c190088\": container with ID starting with 02eb0b572a9d7db26d91ed79b8f1f7e0e781ae5e0ac90b394b51265b1c190088 not found: ID does not exist" Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.060297 4721 scope.go:117] "RemoveContainer" containerID="bfd05ac10c25cef470159320ac73442924479047b666e7ec3809106d9b26e0ed" Feb 02 13:26:30 crc kubenswrapper[4721]: E0202 13:26:30.064148 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfd05ac10c25cef470159320ac73442924479047b666e7ec3809106d9b26e0ed\": container with ID starting with bfd05ac10c25cef470159320ac73442924479047b666e7ec3809106d9b26e0ed not found: ID does not exist" containerID="bfd05ac10c25cef470159320ac73442924479047b666e7ec3809106d9b26e0ed" Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.064181 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd05ac10c25cef470159320ac73442924479047b666e7ec3809106d9b26e0ed"} err="failed to get container status \"bfd05ac10c25cef470159320ac73442924479047b666e7ec3809106d9b26e0ed\": rpc error: code = NotFound desc = could not find 
container \"bfd05ac10c25cef470159320ac73442924479047b666e7ec3809106d9b26e0ed\": container with ID starting with bfd05ac10c25cef470159320ac73442924479047b666e7ec3809106d9b26e0ed not found: ID does not exist" Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.070124 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-m87tn"] Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.079938 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-m87tn"] Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.433818 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a217ca40-3638-474b-b739-cb8784823fa6" path="/var/lib/kubelet/pods/a217ca40-3638-474b-b739-cb8784823fa6/volumes" Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.542322 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.639497 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-scripts\") pod \"798dac79-94bd-4655-b409-4b173956cdbf\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.639741 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlbf6\" (UniqueName: \"kubernetes.io/projected/798dac79-94bd-4655-b409-4b173956cdbf-kube-api-access-mlbf6\") pod \"798dac79-94bd-4655-b409-4b173956cdbf\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.639791 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-combined-ca-bundle\") pod \"798dac79-94bd-4655-b409-4b173956cdbf\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.639836 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-config-data\") pod \"798dac79-94bd-4655-b409-4b173956cdbf\" (UID: \"798dac79-94bd-4655-b409-4b173956cdbf\") " Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.663647 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/798dac79-94bd-4655-b409-4b173956cdbf-kube-api-access-mlbf6" (OuterVolumeSpecName: "kube-api-access-mlbf6") pod "798dac79-94bd-4655-b409-4b173956cdbf" (UID: "798dac79-94bd-4655-b409-4b173956cdbf"). InnerVolumeSpecName "kube-api-access-mlbf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.665198 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-scripts" (OuterVolumeSpecName: "scripts") pod "798dac79-94bd-4655-b409-4b173956cdbf" (UID: "798dac79-94bd-4655-b409-4b173956cdbf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.681319 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-config-data" (OuterVolumeSpecName: "config-data") pod "798dac79-94bd-4655-b409-4b173956cdbf" (UID: "798dac79-94bd-4655-b409-4b173956cdbf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.687173 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "798dac79-94bd-4655-b409-4b173956cdbf" (UID: "798dac79-94bd-4655-b409-4b173956cdbf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.742541 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlbf6\" (UniqueName: \"kubernetes.io/projected/798dac79-94bd-4655-b409-4b173956cdbf-kube-api-access-mlbf6\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.742568 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.742577 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:30 crc kubenswrapper[4721]: I0202 13:26:30.742585 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/798dac79-94bd-4655-b409-4b173956cdbf-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.007781 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4rm59" Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.007924 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4rm59" event={"ID":"798dac79-94bd-4655-b409-4b173956cdbf","Type":"ContainerDied","Data":"f420768dcd70c2968aba8188e47922f2bb1ac5868d1594e6d772fbe971edf203"} Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.007978 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f420768dcd70c2968aba8188e47922f2bb1ac5868d1594e6d772fbe971edf203" Feb 02 13:26:31 crc kubenswrapper[4721]: E0202 13:26:31.024801 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01b253a_c7c6_4c9e_a800_a1732ba06f37.slice/crio-5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6\": RecentStats: unable to find data in memory cache]" Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.190613 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.191200 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8715f5a7-97a1-496d-be28-13c326b54135" containerName="nova-api-log" containerID="cri-o://e678d0355031514e26c0e876e17ad1d6766f6e13bf506ad8e8a465663e92ced1" gracePeriod=30 Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.191722 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8715f5a7-97a1-496d-be28-13c326b54135" containerName="nova-api-api" containerID="cri-o://187aece21bb2425b5bf5ce341d27bac916d46fddbd34aeac0734b4f84ad64e63" gracePeriod=30 Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.221238 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.221413 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6deee91e-3b9b-46a0-a05e-613827b42808" containerName="nova-scheduler-scheduler" containerID="cri-o://3d9337e6ad6b91226e1d6dffcacca3ebd66fbbc0402ee3e8f5821c0fb854a95c" gracePeriod=30 Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.260259 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.666850 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.681317 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-combined-ca-bundle\") pod \"6d38541e-139a-425e-a7bd-f7c484f7266b\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.681530 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-config-data\") pod \"6d38541e-139a-425e-a7bd-f7c484f7266b\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.681778 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-scripts\") pod \"6d38541e-139a-425e-a7bd-f7c484f7266b\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.681833 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdcgd\" (UniqueName: \"kubernetes.io/projected/6d38541e-139a-425e-a7bd-f7c484f7266b-kube-api-access-fdcgd\") pod \"6d38541e-139a-425e-a7bd-f7c484f7266b\" (UID: \"6d38541e-139a-425e-a7bd-f7c484f7266b\") " Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.696251 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-scripts" (OuterVolumeSpecName: "scripts") pod "6d38541e-139a-425e-a7bd-f7c484f7266b" (UID: "6d38541e-139a-425e-a7bd-f7c484f7266b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.710282 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d38541e-139a-425e-a7bd-f7c484f7266b-kube-api-access-fdcgd" (OuterVolumeSpecName: "kube-api-access-fdcgd") pod "6d38541e-139a-425e-a7bd-f7c484f7266b" (UID: "6d38541e-139a-425e-a7bd-f7c484f7266b"). InnerVolumeSpecName "kube-api-access-fdcgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.776231 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-config-data" (OuterVolumeSpecName: "config-data") pod "6d38541e-139a-425e-a7bd-f7c484f7266b" (UID: "6d38541e-139a-425e-a7bd-f7c484f7266b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.784339 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.784371 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.784384 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdcgd\" (UniqueName: \"kubernetes.io/projected/6d38541e-139a-425e-a7bd-f7c484f7266b-kube-api-access-fdcgd\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.802203 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d38541e-139a-425e-a7bd-f7c484f7266b" (UID: "6d38541e-139a-425e-a7bd-f7c484f7266b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:31 crc kubenswrapper[4721]: I0202 13:26:31.887355 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d38541e-139a-425e-a7bd-f7c484f7266b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.039261 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hclll" event={"ID":"6d38541e-139a-425e-a7bd-f7c484f7266b","Type":"ContainerDied","Data":"f16e0687e4c95545bad8ad3d8adc326046486898b2e5a9a157138600b93876bb"} Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.039297 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hclll" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.039307 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f16e0687e4c95545bad8ad3d8adc326046486898b2e5a9a157138600b93876bb" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.043195 4721 generic.go:334] "Generic (PLEG): container finished" podID="8715f5a7-97a1-496d-be28-13c326b54135" containerID="e678d0355031514e26c0e876e17ad1d6766f6e13bf506ad8e8a465663e92ced1" exitCode=143 Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.043306 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8715f5a7-97a1-496d-be28-13c326b54135","Type":"ContainerDied","Data":"e678d0355031514e26c0e876e17ad1d6766f6e13bf506ad8e8a465663e92ced1"} Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.043406 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="07ae1ad9-237c-4b8f-acdf-ba750cc6316e" containerName="nova-metadata-log" containerID="cri-o://7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d" gracePeriod=30 Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.043492 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="07ae1ad9-237c-4b8f-acdf-ba750cc6316e" containerName="nova-metadata-metadata" containerID="cri-o://0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4" gracePeriod=30 Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.093305 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 13:26:32 crc kubenswrapper[4721]: E0202 13:26:32.093897 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a217ca40-3638-474b-b739-cb8784823fa6" containerName="dnsmasq-dns" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.093914 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="a217ca40-3638-474b-b739-cb8784823fa6" containerName="dnsmasq-dns" Feb 02 13:26:32 crc kubenswrapper[4721]: E0202 13:26:32.093936 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d38541e-139a-425e-a7bd-f7c484f7266b" containerName="nova-cell1-conductor-db-sync" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.093942 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d38541e-139a-425e-a7bd-f7c484f7266b" containerName="nova-cell1-conductor-db-sync" Feb 02 13:26:32 crc kubenswrapper[4721]: E0202 13:26:32.093951 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798dac79-94bd-4655-b409-4b173956cdbf" containerName="nova-manage" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.093958 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="798dac79-94bd-4655-b409-4b173956cdbf" containerName="nova-manage" Feb 02 13:26:32 crc kubenswrapper[4721]: E0202 13:26:32.093998 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a217ca40-3638-474b-b739-cb8784823fa6" containerName="init" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.094005 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="a217ca40-3638-474b-b739-cb8784823fa6" containerName="init" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.094222 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="a217ca40-3638-474b-b739-cb8784823fa6" containerName="dnsmasq-dns" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.094235 4721 
memory_manager.go:354] "RemoveStaleState removing state" podUID="798dac79-94bd-4655-b409-4b173956cdbf" containerName="nova-manage" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.094246 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d38541e-139a-425e-a7bd-f7c484f7266b" containerName="nova-cell1-conductor-db-sync" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.095055 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.103268 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.113281 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.197536 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np8xt\" (UniqueName: \"kubernetes.io/projected/3830e692-ad9d-48c7-800f-dc63cadb2376-kube-api-access-np8xt\") pod \"nova-cell1-conductor-0\" (UID: \"3830e692-ad9d-48c7-800f-dc63cadb2376\") " pod="openstack/nova-cell1-conductor-0" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.197778 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3830e692-ad9d-48c7-800f-dc63cadb2376-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3830e692-ad9d-48c7-800f-dc63cadb2376\") " pod="openstack/nova-cell1-conductor-0" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.198030 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3830e692-ad9d-48c7-800f-dc63cadb2376-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3830e692-ad9d-48c7-800f-dc63cadb2376\") " pod="openstack/nova-cell1-conductor-0" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.299404 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np8xt\" (UniqueName: \"kubernetes.io/projected/3830e692-ad9d-48c7-800f-dc63cadb2376-kube-api-access-np8xt\") pod \"nova-cell1-conductor-0\" (UID: \"3830e692-ad9d-48c7-800f-dc63cadb2376\") " pod="openstack/nova-cell1-conductor-0" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.299777 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3830e692-ad9d-48c7-800f-dc63cadb2376-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3830e692-ad9d-48c7-800f-dc63cadb2376\") " pod="openstack/nova-cell1-conductor-0" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.299848 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3830e692-ad9d-48c7-800f-dc63cadb2376-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3830e692-ad9d-48c7-800f-dc63cadb2376\") " pod="openstack/nova-cell1-conductor-0" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.306173 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3830e692-ad9d-48c7-800f-dc63cadb2376-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3830e692-ad9d-48c7-800f-dc63cadb2376\") " 
pod="openstack/nova-cell1-conductor-0" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.306242 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3830e692-ad9d-48c7-800f-dc63cadb2376-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3830e692-ad9d-48c7-800f-dc63cadb2376\") " pod="openstack/nova-cell1-conductor-0" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.326645 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np8xt\" (UniqueName: \"kubernetes.io/projected/3830e692-ad9d-48c7-800f-dc63cadb2376-kube-api-access-np8xt\") pod \"nova-cell1-conductor-0\" (UID: \"3830e692-ad9d-48c7-800f-dc63cadb2376\") " pod="openstack/nova-cell1-conductor-0" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.435821 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.701628 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.830045 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-config-data\") pod \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.830394 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-logs\") pod \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.830440 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-combined-ca-bundle\") pod \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.830539 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-nova-metadata-tls-certs\") pod \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.830566 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhxlz\" (UniqueName: \"kubernetes.io/projected/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-kube-api-access-vhxlz\") pod \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\" (UID: \"07ae1ad9-237c-4b8f-acdf-ba750cc6316e\") " Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.833165 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-logs" (OuterVolumeSpecName: "logs") pod "07ae1ad9-237c-4b8f-acdf-ba750cc6316e" (UID: "07ae1ad9-237c-4b8f-acdf-ba750cc6316e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.839268 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-kube-api-access-vhxlz" (OuterVolumeSpecName: "kube-api-access-vhxlz") pod "07ae1ad9-237c-4b8f-acdf-ba750cc6316e" (UID: "07ae1ad9-237c-4b8f-acdf-ba750cc6316e"). InnerVolumeSpecName "kube-api-access-vhxlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.876432 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07ae1ad9-237c-4b8f-acdf-ba750cc6316e" (UID: "07ae1ad9-237c-4b8f-acdf-ba750cc6316e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.911758 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-config-data" (OuterVolumeSpecName: "config-data") pod "07ae1ad9-237c-4b8f-acdf-ba750cc6316e" (UID: "07ae1ad9-237c-4b8f-acdf-ba750cc6316e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.934892 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.935201 4721 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-logs\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.935326 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.935233 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "07ae1ad9-237c-4b8f-acdf-ba750cc6316e" (UID: "07ae1ad9-237c-4b8f-acdf-ba750cc6316e"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:32 crc kubenswrapper[4721]: I0202 13:26:32.935408 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhxlz\" (UniqueName: \"kubernetes.io/projected/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-kube-api-access-vhxlz\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.038317 4721 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07ae1ad9-237c-4b8f-acdf-ba750cc6316e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.092671 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.093419 4721 generic.go:334] "Generic (PLEG): container finished" podID="07ae1ad9-237c-4b8f-acdf-ba750cc6316e" containerID="0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4" exitCode=0 Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.093447 4721 generic.go:334] "Generic (PLEG): container finished" podID="07ae1ad9-237c-4b8f-acdf-ba750cc6316e" containerID="7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d" exitCode=143 Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.093579 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.098550 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07ae1ad9-237c-4b8f-acdf-ba750cc6316e","Type":"ContainerDied","Data":"0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4"} Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.098699 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07ae1ad9-237c-4b8f-acdf-ba750cc6316e","Type":"ContainerDied","Data":"7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d"} Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.098718 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07ae1ad9-237c-4b8f-acdf-ba750cc6316e","Type":"ContainerDied","Data":"401c3ded565d56700cb511af03efc19368dafbf26ff256c1b27d75fad4eec09d"} Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.098752 4721 scope.go:117] "RemoveContainer" containerID="0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.100121 4721 generic.go:334] "Generic (PLEG): container finished" podID="6deee91e-3b9b-46a0-a05e-613827b42808" containerID="3d9337e6ad6b91226e1d6dffcacca3ebd66fbbc0402ee3e8f5821c0fb854a95c" exitCode=0 Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.100151 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6deee91e-3b9b-46a0-a05e-613827b42808","Type":"ContainerDied","Data":"3d9337e6ad6b91226e1d6dffcacca3ebd66fbbc0402ee3e8f5821c0fb854a95c"} Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.209568 4721 scope.go:117] "RemoveContainer" containerID="7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.211968 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.225207 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-metadata-0"] Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.242751 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:26:33 crc kubenswrapper[4721]: E0202 13:26:33.243465 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ae1ad9-237c-4b8f-acdf-ba750cc6316e" containerName="nova-metadata-log" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.243492 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ae1ad9-237c-4b8f-acdf-ba750cc6316e" containerName="nova-metadata-log" Feb 02 13:26:33 crc kubenswrapper[4721]: E0202 13:26:33.243514 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ae1ad9-237c-4b8f-acdf-ba750cc6316e" containerName="nova-metadata-metadata" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.243523 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ae1ad9-237c-4b8f-acdf-ba750cc6316e" containerName="nova-metadata-metadata" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.243864 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ae1ad9-237c-4b8f-acdf-ba750cc6316e" containerName="nova-metadata-log" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.243894 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ae1ad9-237c-4b8f-acdf-ba750cc6316e" containerName="nova-metadata-metadata" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.245535 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.246608 4721 scope.go:117] "RemoveContainer" containerID="0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4" Feb 02 13:26:33 crc kubenswrapper[4721]: E0202 13:26:33.247080 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4\": container with ID starting with 0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4 not found: ID does not exist" containerID="0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.247113 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4"} err="failed to get container status \"0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4\": rpc error: code = NotFound desc = could not find container \"0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4\": container with ID starting with 0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4 not found: ID does not exist" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.247147 4721 scope.go:117] "RemoveContainer" containerID="7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.248906 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 13:26:33 crc kubenswrapper[4721]: E0202 13:26:33.248933 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d\": container with ID starting with 7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d not found: ID does not exist" 
containerID="7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.248954 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d"} err="failed to get container status \"7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d\": rpc error: code = NotFound desc = could not find container \"7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d\": container with ID starting with 7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d not found: ID does not exist" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.248971 4721 scope.go:117] "RemoveContainer" containerID="0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.249017 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.249438 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4"} err="failed to get container status \"0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4\": rpc error: code = NotFound desc = could not find container \"0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4\": container with ID starting with 0dd31ac8d5e5c820894217a072d1a44f08ac4334e8888db92c06bac98ee2a2c4 not found: ID does not exist" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.249482 4721 scope.go:117] "RemoveContainer" containerID="7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.249980 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d"} err="failed to get container status \"7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d\": rpc error: code = NotFound desc = could not find container \"7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d\": container with ID starting with 7c2a7168cf5c08335a6fc6fc28321f7bb3209bb8dc3bb8f3d305f70c6c81ce1d not found: ID does not exist" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.265900 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.308029 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.348558 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6deee91e-3b9b-46a0-a05e-613827b42808-combined-ca-bundle\") pod \"6deee91e-3b9b-46a0-a05e-613827b42808\" (UID: \"6deee91e-3b9b-46a0-a05e-613827b42808\") " Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.348672 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6deee91e-3b9b-46a0-a05e-613827b42808-config-data\") pod \"6deee91e-3b9b-46a0-a05e-613827b42808\" (UID: \"6deee91e-3b9b-46a0-a05e-613827b42808\") " Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.348696 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4h5x\" (UniqueName: \"kubernetes.io/projected/6deee91e-3b9b-46a0-a05e-613827b42808-kube-api-access-m4h5x\") pod \"6deee91e-3b9b-46a0-a05e-613827b42808\" (UID: \"6deee91e-3b9b-46a0-a05e-613827b42808\") " Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.348994 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t22jm\" (UniqueName: \"kubernetes.io/projected/fbfc7ae7-2e8c-4696-a72e-7308794bf726-kube-api-access-t22jm\") pod \"nova-metadata-0\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.349101 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbfc7ae7-2e8c-4696-a72e-7308794bf726-logs\") pod \"nova-metadata-0\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.349138 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.349281 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-config-data\") pod \"nova-metadata-0\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.349396 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.361282 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6deee91e-3b9b-46a0-a05e-613827b42808-kube-api-access-m4h5x" (OuterVolumeSpecName: "kube-api-access-m4h5x") pod "6deee91e-3b9b-46a0-a05e-613827b42808" (UID: "6deee91e-3b9b-46a0-a05e-613827b42808"). InnerVolumeSpecName "kube-api-access-m4h5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.402234 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6deee91e-3b9b-46a0-a05e-613827b42808-config-data" (OuterVolumeSpecName: "config-data") pod "6deee91e-3b9b-46a0-a05e-613827b42808" (UID: "6deee91e-3b9b-46a0-a05e-613827b42808"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.412241 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6deee91e-3b9b-46a0-a05e-613827b42808-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6deee91e-3b9b-46a0-a05e-613827b42808" (UID: "6deee91e-3b9b-46a0-a05e-613827b42808"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.450722 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.450799 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t22jm\" (UniqueName: \"kubernetes.io/projected/fbfc7ae7-2e8c-4696-a72e-7308794bf726-kube-api-access-t22jm\") pod \"nova-metadata-0\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.450908 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbfc7ae7-2e8c-4696-a72e-7308794bf726-logs\") pod \"nova-metadata-0\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.450954 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.451157 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-config-data\") pod \"nova-metadata-0\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.451268 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6deee91e-3b9b-46a0-a05e-613827b42808-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.451283 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6deee91e-3b9b-46a0-a05e-613827b42808-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.451295 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4h5x\" (UniqueName: \"kubernetes.io/projected/6deee91e-3b9b-46a0-a05e-613827b42808-kube-api-access-m4h5x\") on node \"crc\" 
DevicePath \"\"" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.451890 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbfc7ae7-2e8c-4696-a72e-7308794bf726-logs\") pod \"nova-metadata-0\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.454359 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.455297 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-config-data\") pod \"nova-metadata-0\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.455732 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.469930 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t22jm\" (UniqueName: \"kubernetes.io/projected/fbfc7ae7-2e8c-4696-a72e-7308794bf726-kube-api-access-t22jm\") pod \"nova-metadata-0\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " pod="openstack/nova-metadata-0" Feb 02 13:26:33 crc kubenswrapper[4721]: I0202 13:26:33.562494 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.123280 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3830e692-ad9d-48c7-800f-dc63cadb2376","Type":"ContainerStarted","Data":"fc2c5eca83ec18e057bf45edfa8caefc16dff1bab315e2b7db26854a8f90f3b4"} Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.123659 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3830e692-ad9d-48c7-800f-dc63cadb2376","Type":"ContainerStarted","Data":"2881a9ef319197e2ad270396cac1ac972ea7ca641749e280f5adf40c46e58733"} Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.124183 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.127460 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6deee91e-3b9b-46a0-a05e-613827b42808","Type":"ContainerDied","Data":"6be24bc44a924556f23ce7d2ad20d369ad32da8a65f3765a7bf2193db6ff3cfc"} Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.127513 4721 scope.go:117] "RemoveContainer" containerID="3d9337e6ad6b91226e1d6dffcacca3ebd66fbbc0402ee3e8f5821c0fb854a95c" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.127645 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.136413 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.136448 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7756b9d78c-m87tn" podUID="a217ca40-3638-474b-b739-cb8784823fa6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.219:5353: i/o timeout" Feb 02 13:26:34 crc kubenswrapper[4721]: W0202 13:26:34.142629 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbfc7ae7_2e8c_4696_a72e_7308794bf726.slice/crio-47ce14db062af36ae7fdbf425398b0057cb5127414ac2231235ece255a18295a WatchSource:0}: Error finding container 47ce14db062af36ae7fdbf425398b0057cb5127414ac2231235ece255a18295a: Status 404 returned error can't find the container with id 47ce14db062af36ae7fdbf425398b0057cb5127414ac2231235ece255a18295a Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.154663 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.154644651 podStartE2EDuration="2.154644651s" podCreationTimestamp="2026-02-02 13:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:26:34.148499553 +0000 UTC m=+1534.451013962" watchObservedRunningTime="2026-02-02 13:26:34.154644651 +0000 UTC m=+1534.457159040" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.227636 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.250573 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.268594 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 13:26:34 crc kubenswrapper[4721]: E0202 13:26:34.269145 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6deee91e-3b9b-46a0-a05e-613827b42808" containerName="nova-scheduler-scheduler" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.269164 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="6deee91e-3b9b-46a0-a05e-613827b42808" containerName="nova-scheduler-scheduler" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.269377 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="6deee91e-3b9b-46a0-a05e-613827b42808" containerName="nova-scheduler-scheduler" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.270231 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.273311 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.280304 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.280356 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb5wd\" (UniqueName: \"kubernetes.io/projected/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-kube-api-access-zb5wd\") pod \"nova-scheduler-0\" (UID: \"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.280424 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-config-data\") pod \"nova-scheduler-0\" (UID: \"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.281958 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.390097 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.390430 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb5wd\" (UniqueName: \"kubernetes.io/projected/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-kube-api-access-zb5wd\") pod \"nova-scheduler-0\" (UID: \"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.390613 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-config-data\") pod \"nova-scheduler-0\" (UID: \"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.397208 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-config-data\") pod \"nova-scheduler-0\" (UID: \"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.407959 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.410003 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb5wd\" (UniqueName: 
\"kubernetes.io/projected/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-kube-api-access-zb5wd\") pod \"nova-scheduler-0\" (UID: \"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1\") " pod="openstack/nova-scheduler-0" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.428269 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07ae1ad9-237c-4b8f-acdf-ba750cc6316e" path="/var/lib/kubelet/pods/07ae1ad9-237c-4b8f-acdf-ba750cc6316e/volumes" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.429185 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6deee91e-3b9b-46a0-a05e-613827b42808" path="/var/lib/kubelet/pods/6deee91e-3b9b-46a0-a05e-613827b42808/volumes" Feb 02 13:26:34 crc kubenswrapper[4721]: I0202 13:26:34.637245 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 13:26:35 crc kubenswrapper[4721]: I0202 13:26:35.148778 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbfc7ae7-2e8c-4696-a72e-7308794bf726","Type":"ContainerStarted","Data":"74967a457dae02a2ce8a603739bc30e8cdd7da978739f51561117554fb3ba203"} Feb 02 13:26:35 crc kubenswrapper[4721]: I0202 13:26:35.149504 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbfc7ae7-2e8c-4696-a72e-7308794bf726","Type":"ContainerStarted","Data":"4fd6cf6542770e67b596179be2f1fb73bfe89b859a5b9f3f4406655d59a15229"} Feb 02 13:26:35 crc kubenswrapper[4721]: I0202 13:26:35.149522 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbfc7ae7-2e8c-4696-a72e-7308794bf726","Type":"ContainerStarted","Data":"47ce14db062af36ae7fdbf425398b0057cb5127414ac2231235ece255a18295a"} Feb 02 13:26:35 crc kubenswrapper[4721]: W0202 13:26:35.187522 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f591d46_b7ce_4767_987a_bcdaa2f6d3b1.slice/crio-069121de18c3f33f1304829d8169978a460562815bd16049d235be352ef578bb WatchSource:0}: Error finding container 069121de18c3f33f1304829d8169978a460562815bd16049d235be352ef578bb: Status 404 returned error can't find the container with id 069121de18c3f33f1304829d8169978a460562815bd16049d235be352ef578bb Feb 02 13:26:35 crc kubenswrapper[4721]: I0202 13:26:35.196528 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.196501347 podStartE2EDuration="2.196501347s" podCreationTimestamp="2026-02-02 13:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:26:35.166305137 +0000 UTC m=+1535.468819526" watchObservedRunningTime="2026-02-02 13:26:35.196501347 +0000 UTC m=+1535.499015736" Feb 02 13:26:35 crc kubenswrapper[4721]: I0202 13:26:35.213156 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 13:26:36 crc kubenswrapper[4721]: I0202 13:26:36.164254 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1","Type":"ContainerStarted","Data":"5acb431b8e0eb7677b98d89ff66f186391a5d39e1b675d018fe41d3b90c905cd"} Feb 02 13:26:36 crc kubenswrapper[4721]: I0202 13:26:36.164499 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1","Type":"ContainerStarted","Data":"069121de18c3f33f1304829d8169978a460562815bd16049d235be352ef578bb"} Feb 02 13:26:36 crc kubenswrapper[4721]: I0202 13:26:36.194508 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.194485063 podStartE2EDuration="2.194485063s" podCreationTimestamp="2026-02-02 13:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:26:36.18554655 +0000 UTC m=+1536.488060939" watchObservedRunningTime="2026-02-02 13:26:36.194485063 +0000 UTC m=+1536.496999472" Feb 02 13:26:37 crc kubenswrapper[4721]: I0202 13:26:37.178537 4721 generic.go:334] "Generic (PLEG): container finished" podID="8715f5a7-97a1-496d-be28-13c326b54135" containerID="187aece21bb2425b5bf5ce341d27bac916d46fddbd34aeac0734b4f84ad64e63" exitCode=0 Feb 02 13:26:37 crc kubenswrapper[4721]: I0202 13:26:37.182621 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8715f5a7-97a1-496d-be28-13c326b54135","Type":"ContainerDied","Data":"187aece21bb2425b5bf5ce341d27bac916d46fddbd34aeac0734b4f84ad64e63"} Feb 02 13:26:37 crc kubenswrapper[4721]: I0202 13:26:37.397961 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 13:26:37 crc kubenswrapper[4721]: I0202 13:26:37.429972 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8715f5a7-97a1-496d-be28-13c326b54135-logs\") pod \"8715f5a7-97a1-496d-be28-13c326b54135\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " Feb 02 13:26:37 crc kubenswrapper[4721]: I0202 13:26:37.430261 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8715f5a7-97a1-496d-be28-13c326b54135-config-data\") pod \"8715f5a7-97a1-496d-be28-13c326b54135\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " Feb 02 13:26:37 crc kubenswrapper[4721]: I0202 13:26:37.430377 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8715f5a7-97a1-496d-be28-13c326b54135-combined-ca-bundle\") pod \"8715f5a7-97a1-496d-be28-13c326b54135\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " Feb 02 13:26:37 crc kubenswrapper[4721]: I0202 13:26:37.430432 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpktt\" (UniqueName: \"kubernetes.io/projected/8715f5a7-97a1-496d-be28-13c326b54135-kube-api-access-wpktt\") pod \"8715f5a7-97a1-496d-be28-13c326b54135\" (UID: \"8715f5a7-97a1-496d-be28-13c326b54135\") " Feb 02 13:26:37 crc kubenswrapper[4721]: I0202 13:26:37.432246 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8715f5a7-97a1-496d-be28-13c326b54135-logs" (OuterVolumeSpecName: "logs") pod "8715f5a7-97a1-496d-be28-13c326b54135" (UID: "8715f5a7-97a1-496d-be28-13c326b54135"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:26:37 crc kubenswrapper[4721]: I0202 13:26:37.440872 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8715f5a7-97a1-496d-be28-13c326b54135-kube-api-access-wpktt" (OuterVolumeSpecName: "kube-api-access-wpktt") pod "8715f5a7-97a1-496d-be28-13c326b54135" (UID: "8715f5a7-97a1-496d-be28-13c326b54135"). InnerVolumeSpecName "kube-api-access-wpktt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:26:37 crc kubenswrapper[4721]: I0202 13:26:37.465110 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8715f5a7-97a1-496d-be28-13c326b54135-config-data" (OuterVolumeSpecName: "config-data") pod "8715f5a7-97a1-496d-be28-13c326b54135" (UID: "8715f5a7-97a1-496d-be28-13c326b54135"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:37 crc kubenswrapper[4721]: I0202 13:26:37.477248 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8715f5a7-97a1-496d-be28-13c326b54135-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8715f5a7-97a1-496d-be28-13c326b54135" (UID: "8715f5a7-97a1-496d-be28-13c326b54135"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:37 crc kubenswrapper[4721]: I0202 13:26:37.533130 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8715f5a7-97a1-496d-be28-13c326b54135-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:37 crc kubenswrapper[4721]: I0202 13:26:37.533167 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8715f5a7-97a1-496d-be28-13c326b54135-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:37 crc kubenswrapper[4721]: I0202 13:26:37.533182 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpktt\" (UniqueName: \"kubernetes.io/projected/8715f5a7-97a1-496d-be28-13c326b54135-kube-api-access-wpktt\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:37 crc kubenswrapper[4721]: I0202 13:26:37.533190 4721 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8715f5a7-97a1-496d-be28-13c326b54135-logs\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.192116 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8715f5a7-97a1-496d-be28-13c326b54135","Type":"ContainerDied","Data":"dda040b5b2dc6f664a770c8e322b217a952cd150043f7182ef1730acf2437842"} Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.192405 4721 scope.go:117] "RemoveContainer" containerID="187aece21bb2425b5bf5ce341d27bac916d46fddbd34aeac0734b4f84ad64e63" Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.192567 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.219138 4721 scope.go:117] "RemoveContainer" containerID="e678d0355031514e26c0e876e17ad1d6766f6e13bf506ad8e8a465663e92ced1" Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.240783 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.266013 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.292493 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 13:26:38 crc kubenswrapper[4721]: E0202 13:26:38.294429 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8715f5a7-97a1-496d-be28-13c326b54135" containerName="nova-api-log" Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.294468 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="8715f5a7-97a1-496d-be28-13c326b54135" containerName="nova-api-log" Feb 02 13:26:38 crc kubenswrapper[4721]: E0202 13:26:38.294501 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8715f5a7-97a1-496d-be28-13c326b54135" containerName="nova-api-api" Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.294513 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="8715f5a7-97a1-496d-be28-13c326b54135" containerName="nova-api-api" Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.294804 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="8715f5a7-97a1-496d-be28-13c326b54135" containerName="nova-api-log" Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.294837 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="8715f5a7-97a1-496d-be28-13c326b54135" containerName="nova-api-api" Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.296121 4721 util.go:30] "No sandbox for pod can be found. 
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.296121 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.301850 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.308123 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.352845 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25b4g\" (UniqueName: \"kubernetes.io/projected/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-kube-api-access-25b4g\") pod \"nova-api-0\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " pod="openstack/nova-api-0"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.352943 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " pod="openstack/nova-api-0"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.352996 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-logs\") pod \"nova-api-0\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " pod="openstack/nova-api-0"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.353042 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-config-data\") pod \"nova-api-0\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " pod="openstack/nova-api-0"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.424650 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8715f5a7-97a1-496d-be28-13c326b54135" path="/var/lib/kubelet/pods/8715f5a7-97a1-496d-be28-13c326b54135/volumes"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.455746 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25b4g\" (UniqueName: \"kubernetes.io/projected/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-kube-api-access-25b4g\") pod \"nova-api-0\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " pod="openstack/nova-api-0"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.455939 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " pod="openstack/nova-api-0"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.456047 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-logs\") pod \"nova-api-0\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " pod="openstack/nova-api-0"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.456110 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-config-data\") pod \"nova-api-0\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " pod="openstack/nova-api-0"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.457393 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-logs\") pod \"nova-api-0\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " pod="openstack/nova-api-0"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.464492 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " pod="openstack/nova-api-0"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.475868 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25b4g\" (UniqueName: \"kubernetes.io/projected/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-kube-api-access-25b4g\") pod \"nova-api-0\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " pod="openstack/nova-api-0"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.475883 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-config-data\") pod \"nova-api-0\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " pod="openstack/nova-api-0"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.563117 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.564350 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 02 13:26:38 crc kubenswrapper[4721]: I0202 13:26:38.627399 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 02 13:26:39 crc kubenswrapper[4721]: I0202 13:26:39.194574 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 02 13:26:39 crc kubenswrapper[4721]: W0202 13:26:39.201516 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43a836c9_b9f7_4991_9cb4_db6dce6f8e08.slice/crio-4fca7f0bef96bd658599c1f0a2e8aa492cdcf6a60b42a42e88a85a2eec4a0be9 WatchSource:0}: Error finding container 4fca7f0bef96bd658599c1f0a2e8aa492cdcf6a60b42a42e88a85a2eec4a0be9: Status 404 returned error can't find the container with id 4fca7f0bef96bd658599c1f0a2e8aa492cdcf6a60b42a42e88a85a2eec4a0be9
Feb 02 13:26:39 crc kubenswrapper[4721]: I0202 13:26:39.638042 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 02 13:26:40 crc kubenswrapper[4721]: I0202 13:26:40.218665 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43a836c9-b9f7-4991-9cb4-db6dce6f8e08","Type":"ContainerStarted","Data":"cb14a013f6e5ea8bc11bf1264095a5d3c6d4ad516551071c013c66ca9a01bd5e"}
Feb 02 13:26:40 crc kubenswrapper[4721]: I0202 13:26:40.220306 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43a836c9-b9f7-4991-9cb4-db6dce6f8e08","Type":"ContainerStarted","Data":"59c81dc2121b1526d3fc9cd11a40475f9cc813f682849fa4432ebb8a77fa2a04"}
event={"ID":"43a836c9-b9f7-4991-9cb4-db6dce6f8e08","Type":"ContainerStarted","Data":"4fca7f0bef96bd658599c1f0a2e8aa492cdcf6a60b42a42e88a85a2eec4a0be9"} Feb 02 13:26:40 crc kubenswrapper[4721]: I0202 13:26:40.245700 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.245678245 podStartE2EDuration="2.245678245s" podCreationTimestamp="2026-02-02 13:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:26:40.241028399 +0000 UTC m=+1540.543542788" watchObservedRunningTime="2026-02-02 13:26:40.245678245 +0000 UTC m=+1540.548192634" Feb 02 13:26:40 crc kubenswrapper[4721]: E0202 13:26:40.330628 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01b253a_c7c6_4c9e_a800_a1732ba06f37.slice/crio-5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6\": RecentStats: unable to find data in memory cache]" Feb 02 13:26:41 crc kubenswrapper[4721]: E0202 13:26:41.078585 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01b253a_c7c6_4c9e_a800_a1732ba06f37.slice/crio-5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6\": RecentStats: unable to find data in memory cache]" Feb 02 13:26:41 crc kubenswrapper[4721]: I0202 13:26:41.944941 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 02 13:26:42 crc kubenswrapper[4721]: I0202 13:26:42.471816 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 02 13:26:43 crc kubenswrapper[4721]: I0202 13:26:43.562784 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 13:26:43 crc kubenswrapper[4721]: I0202 13:26:43.562867 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 13:26:44 crc kubenswrapper[4721]: I0202 13:26:44.577376 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fbfc7ae7-2e8c-4696-a72e-7308794bf726" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.254:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 13:26:44 crc kubenswrapper[4721]: I0202 13:26:44.578029 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fbfc7ae7-2e8c-4696-a72e-7308794bf726" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.254:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 13:26:44 crc kubenswrapper[4721]: I0202 13:26:44.638248 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 02 13:26:44 crc kubenswrapper[4721]: I0202 13:26:44.670205 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 02 13:26:45 crc kubenswrapper[4721]: I0202 13:26:45.324489 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 02 13:26:46 crc kubenswrapper[4721]: I0202 13:26:46.459758 4721 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 13:26:46 crc kubenswrapper[4721]: I0202 13:26:46.460518 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="cc071000-a602-4de6-a9bc-1c93b6d58c25" containerName="kube-state-metrics" containerID="cri-o://28479592c480267adca57389ec0895d73f9fc0cf7e8ea4b979c6f7d9640013a3" gracePeriod=30 Feb 02 13:26:46 crc kubenswrapper[4721]: I0202 13:26:46.593711 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 13:26:46 crc kubenswrapper[4721]: I0202 13:26:46.593958 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="7a6930c7-1819-4b7d-baf6-773a8b68e568" containerName="mysqld-exporter" containerID="cri-o://425d57884fbd57da1d35ea45143e44ac869dd71714bc4250dbb690863f08fa73" gracePeriod=30 Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.113477 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.160137 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlrn8\" (UniqueName: \"kubernetes.io/projected/cc071000-a602-4de6-a9bc-1c93b6d58c25-kube-api-access-wlrn8\") pod \"cc071000-a602-4de6-a9bc-1c93b6d58c25\" (UID: \"cc071000-a602-4de6-a9bc-1c93b6d58c25\") " Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.168265 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc071000-a602-4de6-a9bc-1c93b6d58c25-kube-api-access-wlrn8" (OuterVolumeSpecName: "kube-api-access-wlrn8") pod "cc071000-a602-4de6-a9bc-1c93b6d58c25" (UID: "cc071000-a602-4de6-a9bc-1c93b6d58c25"). InnerVolumeSpecName "kube-api-access-wlrn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.241956 4721 util.go:48] "No ready sandbox for pod can be found. 
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.241956 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.261653 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6930c7-1819-4b7d-baf6-773a8b68e568-combined-ca-bundle\") pod \"7a6930c7-1819-4b7d-baf6-773a8b68e568\" (UID: \"7a6930c7-1819-4b7d-baf6-773a8b68e568\") "
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.261777 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a6930c7-1819-4b7d-baf6-773a8b68e568-config-data\") pod \"7a6930c7-1819-4b7d-baf6-773a8b68e568\" (UID: \"7a6930c7-1819-4b7d-baf6-773a8b68e568\") "
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.261852 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jz54\" (UniqueName: \"kubernetes.io/projected/7a6930c7-1819-4b7d-baf6-773a8b68e568-kube-api-access-8jz54\") pod \"7a6930c7-1819-4b7d-baf6-773a8b68e568\" (UID: \"7a6930c7-1819-4b7d-baf6-773a8b68e568\") "
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.262558 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlrn8\" (UniqueName: \"kubernetes.io/projected/cc071000-a602-4de6-a9bc-1c93b6d58c25-kube-api-access-wlrn8\") on node \"crc\" DevicePath \"\""
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.276673 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a6930c7-1819-4b7d-baf6-773a8b68e568-kube-api-access-8jz54" (OuterVolumeSpecName: "kube-api-access-8jz54") pod "7a6930c7-1819-4b7d-baf6-773a8b68e568" (UID: "7a6930c7-1819-4b7d-baf6-773a8b68e568"). InnerVolumeSpecName "kube-api-access-8jz54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.307268 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6930c7-1819-4b7d-baf6-773a8b68e568-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a6930c7-1819-4b7d-baf6-773a8b68e568" (UID: "7a6930c7-1819-4b7d-baf6-773a8b68e568"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.311772 4721 generic.go:334] "Generic (PLEG): container finished" podID="7a6930c7-1819-4b7d-baf6-773a8b68e568" containerID="425d57884fbd57da1d35ea45143e44ac869dd71714bc4250dbb690863f08fa73" exitCode=2
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.311837 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"7a6930c7-1819-4b7d-baf6-773a8b68e568","Type":"ContainerDied","Data":"425d57884fbd57da1d35ea45143e44ac869dd71714bc4250dbb690863f08fa73"}
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.311865 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"7a6930c7-1819-4b7d-baf6-773a8b68e568","Type":"ContainerDied","Data":"29066421e6cce726a66f30e6952c937493f0f81dbe0ff9779f6c880b60322c1e"}
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.311881 4721 scope.go:117] "RemoveContainer" containerID="425d57884fbd57da1d35ea45143e44ac869dd71714bc4250dbb690863f08fa73"
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.312025 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.317770 4721 generic.go:334] "Generic (PLEG): container finished" podID="cc071000-a602-4de6-a9bc-1c93b6d58c25" containerID="28479592c480267adca57389ec0895d73f9fc0cf7e8ea4b979c6f7d9640013a3" exitCode=2
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.317814 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cc071000-a602-4de6-a9bc-1c93b6d58c25","Type":"ContainerDied","Data":"28479592c480267adca57389ec0895d73f9fc0cf7e8ea4b979c6f7d9640013a3"}
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.317840 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cc071000-a602-4de6-a9bc-1c93b6d58c25","Type":"ContainerDied","Data":"f06dd476cbb6c5a0d98d77cc9568acefbefff19f14d266eab26513942d1c3774"}
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.317890 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.350516 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6930c7-1819-4b7d-baf6-773a8b68e568-config-data" (OuterVolumeSpecName: "config-data") pod "7a6930c7-1819-4b7d-baf6-773a8b68e568" (UID: "7a6930c7-1819-4b7d-baf6-773a8b68e568"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.352685 4721 scope.go:117] "RemoveContainer" containerID="425d57884fbd57da1d35ea45143e44ac869dd71714bc4250dbb690863f08fa73"
Feb 02 13:26:47 crc kubenswrapper[4721]: E0202 13:26:47.354494 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"425d57884fbd57da1d35ea45143e44ac869dd71714bc4250dbb690863f08fa73\": container with ID starting with 425d57884fbd57da1d35ea45143e44ac869dd71714bc4250dbb690863f08fa73 not found: ID does not exist" containerID="425d57884fbd57da1d35ea45143e44ac869dd71714bc4250dbb690863f08fa73"
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.354566 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425d57884fbd57da1d35ea45143e44ac869dd71714bc4250dbb690863f08fa73"} err="failed to get container status \"425d57884fbd57da1d35ea45143e44ac869dd71714bc4250dbb690863f08fa73\": rpc error: code = NotFound desc = could not find container \"425d57884fbd57da1d35ea45143e44ac869dd71714bc4250dbb690863f08fa73\": container with ID starting with 425d57884fbd57da1d35ea45143e44ac869dd71714bc4250dbb690863f08fa73 not found: ID does not exist"
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.354648 4721 scope.go:117] "RemoveContainer" containerID="28479592c480267adca57389ec0895d73f9fc0cf7e8ea4b979c6f7d9640013a3"
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.366515 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.366926 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6930c7-1819-4b7d-baf6-773a8b68e568-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
\"kubernetes.io/secret/7a6930c7-1819-4b7d-baf6-773a8b68e568-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.366968 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jz54\" (UniqueName: \"kubernetes.io/projected/7a6930c7-1819-4b7d-baf6-773a8b68e568-kube-api-access-8jz54\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.383277 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.390769 4721 scope.go:117] "RemoveContainer" containerID="28479592c480267adca57389ec0895d73f9fc0cf7e8ea4b979c6f7d9640013a3" Feb 02 13:26:47 crc kubenswrapper[4721]: E0202 13:26:47.392887 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28479592c480267adca57389ec0895d73f9fc0cf7e8ea4b979c6f7d9640013a3\": container with ID starting with 28479592c480267adca57389ec0895d73f9fc0cf7e8ea4b979c6f7d9640013a3 not found: ID does not exist" containerID="28479592c480267adca57389ec0895d73f9fc0cf7e8ea4b979c6f7d9640013a3" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.392945 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28479592c480267adca57389ec0895d73f9fc0cf7e8ea4b979c6f7d9640013a3"} err="failed to get container status \"28479592c480267adca57389ec0895d73f9fc0cf7e8ea4b979c6f7d9640013a3\": rpc error: code = NotFound desc = could not find container \"28479592c480267adca57389ec0895d73f9fc0cf7e8ea4b979c6f7d9640013a3\": container with ID starting with 28479592c480267adca57389ec0895d73f9fc0cf7e8ea4b979c6f7d9640013a3 not found: ID does not exist" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.399192 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 13:26:47 crc kubenswrapper[4721]: E0202 13:26:47.399719 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a6930c7-1819-4b7d-baf6-773a8b68e568" containerName="mysqld-exporter" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.399739 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a6930c7-1819-4b7d-baf6-773a8b68e568" containerName="mysqld-exporter" Feb 02 13:26:47 crc kubenswrapper[4721]: E0202 13:26:47.399781 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc071000-a602-4de6-a9bc-1c93b6d58c25" containerName="kube-state-metrics" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.399791 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc071000-a602-4de6-a9bc-1c93b6d58c25" containerName="kube-state-metrics" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.399999 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a6930c7-1819-4b7d-baf6-773a8b68e568" containerName="mysqld-exporter" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.400042 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc071000-a602-4de6-a9bc-1c93b6d58c25" containerName="kube-state-metrics" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.400893 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.403192 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.405978 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.414558 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.468935 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac827915-eefd-428b-9303-581069f92ed8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ac827915-eefd-428b-9303-581069f92ed8\") " pod="openstack/kube-state-metrics-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.469060 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxpdj\" (UniqueName: \"kubernetes.io/projected/ac827915-eefd-428b-9303-581069f92ed8-kube-api-access-fxpdj\") pod \"kube-state-metrics-0\" (UID: \"ac827915-eefd-428b-9303-581069f92ed8\") " pod="openstack/kube-state-metrics-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.469138 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac827915-eefd-428b-9303-581069f92ed8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ac827915-eefd-428b-9303-581069f92ed8\") " pod="openstack/kube-state-metrics-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.469305 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ac827915-eefd-428b-9303-581069f92ed8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ac827915-eefd-428b-9303-581069f92ed8\") " pod="openstack/kube-state-metrics-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.571550 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ac827915-eefd-428b-9303-581069f92ed8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ac827915-eefd-428b-9303-581069f92ed8\") " pod="openstack/kube-state-metrics-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.571687 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac827915-eefd-428b-9303-581069f92ed8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ac827915-eefd-428b-9303-581069f92ed8\") " pod="openstack/kube-state-metrics-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.571756 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxpdj\" (UniqueName: \"kubernetes.io/projected/ac827915-eefd-428b-9303-581069f92ed8-kube-api-access-fxpdj\") pod \"kube-state-metrics-0\" (UID: \"ac827915-eefd-428b-9303-581069f92ed8\") " pod="openstack/kube-state-metrics-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.571803 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ac827915-eefd-428b-9303-581069f92ed8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ac827915-eefd-428b-9303-581069f92ed8\") " pod="openstack/kube-state-metrics-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.575955 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ac827915-eefd-428b-9303-581069f92ed8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ac827915-eefd-428b-9303-581069f92ed8\") " pod="openstack/kube-state-metrics-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.577207 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac827915-eefd-428b-9303-581069f92ed8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ac827915-eefd-428b-9303-581069f92ed8\") " pod="openstack/kube-state-metrics-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.578581 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac827915-eefd-428b-9303-581069f92ed8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ac827915-eefd-428b-9303-581069f92ed8\") " pod="openstack/kube-state-metrics-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.597914 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxpdj\" (UniqueName: \"kubernetes.io/projected/ac827915-eefd-428b-9303-581069f92ed8-kube-api-access-fxpdj\") pod \"kube-state-metrics-0\" (UID: \"ac827915-eefd-428b-9303-581069f92ed8\") " pod="openstack/kube-state-metrics-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.722331 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.800461 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.830323 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.850555 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.853191 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.858914 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.859187 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.864058 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.986507 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/8abde028-43c5-4489-8de6-7c2da9f037c2-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"8abde028-43c5-4489-8de6-7c2da9f037c2\") " pod="openstack/mysqld-exporter-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.986596 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2bbp\" (UniqueName: \"kubernetes.io/projected/8abde028-43c5-4489-8de6-7c2da9f037c2-kube-api-access-t2bbp\") pod \"mysqld-exporter-0\" (UID: \"8abde028-43c5-4489-8de6-7c2da9f037c2\") " pod="openstack/mysqld-exporter-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.986690 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8abde028-43c5-4489-8de6-7c2da9f037c2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8abde028-43c5-4489-8de6-7c2da9f037c2\") " pod="openstack/mysqld-exporter-0" Feb 02 13:26:47 crc kubenswrapper[4721]: I0202 13:26:47.986709 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8abde028-43c5-4489-8de6-7c2da9f037c2-config-data\") pod \"mysqld-exporter-0\" (UID: \"8abde028-43c5-4489-8de6-7c2da9f037c2\") " pod="openstack/mysqld-exporter-0" Feb 02 13:26:48 crc kubenswrapper[4721]: I0202 13:26:48.089173 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8abde028-43c5-4489-8de6-7c2da9f037c2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8abde028-43c5-4489-8de6-7c2da9f037c2\") " pod="openstack/mysqld-exporter-0" Feb 02 13:26:48 crc kubenswrapper[4721]: I0202 13:26:48.089219 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8abde028-43c5-4489-8de6-7c2da9f037c2-config-data\") pod \"mysqld-exporter-0\" (UID: \"8abde028-43c5-4489-8de6-7c2da9f037c2\") " pod="openstack/mysqld-exporter-0" Feb 02 13:26:48 crc kubenswrapper[4721]: I0202 13:26:48.089361 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/8abde028-43c5-4489-8de6-7c2da9f037c2-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"8abde028-43c5-4489-8de6-7c2da9f037c2\") " pod="openstack/mysqld-exporter-0" Feb 02 13:26:48 crc kubenswrapper[4721]: I0202 13:26:48.089444 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2bbp\" (UniqueName: \"kubernetes.io/projected/8abde028-43c5-4489-8de6-7c2da9f037c2-kube-api-access-t2bbp\") pod \"mysqld-exporter-0\" (UID: 
\"8abde028-43c5-4489-8de6-7c2da9f037c2\") " pod="openstack/mysqld-exporter-0" Feb 02 13:26:48 crc kubenswrapper[4721]: I0202 13:26:48.097244 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/8abde028-43c5-4489-8de6-7c2da9f037c2-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"8abde028-43c5-4489-8de6-7c2da9f037c2\") " pod="openstack/mysqld-exporter-0" Feb 02 13:26:48 crc kubenswrapper[4721]: I0202 13:26:48.101863 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8abde028-43c5-4489-8de6-7c2da9f037c2-config-data\") pod \"mysqld-exporter-0\" (UID: \"8abde028-43c5-4489-8de6-7c2da9f037c2\") " pod="openstack/mysqld-exporter-0" Feb 02 13:26:48 crc kubenswrapper[4721]: I0202 13:26:48.102706 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8abde028-43c5-4489-8de6-7c2da9f037c2-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8abde028-43c5-4489-8de6-7c2da9f037c2\") " pod="openstack/mysqld-exporter-0" Feb 02 13:26:48 crc kubenswrapper[4721]: I0202 13:26:48.131828 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2bbp\" (UniqueName: \"kubernetes.io/projected/8abde028-43c5-4489-8de6-7c2da9f037c2-kube-api-access-t2bbp\") pod \"mysqld-exporter-0\" (UID: \"8abde028-43c5-4489-8de6-7c2da9f037c2\") " pod="openstack/mysqld-exporter-0" Feb 02 13:26:48 crc kubenswrapper[4721]: I0202 13:26:48.212291 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 02 13:26:48 crc kubenswrapper[4721]: I0202 13:26:48.340374 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 13:26:48 crc kubenswrapper[4721]: E0202 13:26:48.360909 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01b253a_c7c6_4c9e_a800_a1732ba06f37.slice/crio-5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6\": RecentStats: unable to find data in memory cache]" Feb 02 13:26:48 crc kubenswrapper[4721]: E0202 13:26:48.361413 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01b253a_c7c6_4c9e_a800_a1732ba06f37.slice/crio-5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6\": RecentStats: unable to find data in memory cache]" Feb 02 13:26:48 crc kubenswrapper[4721]: I0202 13:26:48.432713 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a6930c7-1819-4b7d-baf6-773a8b68e568" path="/var/lib/kubelet/pods/7a6930c7-1819-4b7d-baf6-773a8b68e568/volumes" Feb 02 13:26:48 crc kubenswrapper[4721]: I0202 13:26:48.433357 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc071000-a602-4de6-a9bc-1c93b6d58c25" path="/var/lib/kubelet/pods/cc071000-a602-4de6-a9bc-1c93b6d58c25/volumes" Feb 02 13:26:48 crc kubenswrapper[4721]: I0202 13:26:48.629053 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 13:26:48 crc kubenswrapper[4721]: I0202 13:26:48.629411 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 13:26:48 crc 
Feb 02 13:26:48 crc kubenswrapper[4721]: I0202 13:26:48.840485 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"]
Feb 02 13:26:49 crc kubenswrapper[4721]: I0202 13:26:49.416313 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ac827915-eefd-428b-9303-581069f92ed8","Type":"ContainerStarted","Data":"e2a207f99084f376858d13078ae9481c775747749fd195e00939bc5fb045a904"}
Feb 02 13:26:49 crc kubenswrapper[4721]: I0202 13:26:49.416617 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ac827915-eefd-428b-9303-581069f92ed8","Type":"ContainerStarted","Data":"38557ac7c7d354970f97a02935fc45e00a7133c6ffa25beb95cc1dab3b8d385f"}
Feb 02 13:26:49 crc kubenswrapper[4721]: I0202 13:26:49.418047 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 02 13:26:49 crc kubenswrapper[4721]: I0202 13:26:49.421041 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"8abde028-43c5-4489-8de6-7c2da9f037c2","Type":"ContainerStarted","Data":"9df0c5b879f442ee98d22de6e236b64518fc82803847884f8e77d69a0544e020"}
Feb 02 13:26:49 crc kubenswrapper[4721]: I0202 13:26:49.437671 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 13:26:49 crc kubenswrapper[4721]: I0202 13:26:49.438096 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="ceilometer-central-agent" containerID="cri-o://941cea547a9c6139f4cd34b3cba1a9232469327b41a70627da4554d99e83c28b" gracePeriod=30
Feb 02 13:26:49 crc kubenswrapper[4721]: I0202 13:26:49.438718 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="proxy-httpd" containerID="cri-o://4b856042309c84ea0a6e8c400ccf627a4542f6766de95577e1154ef8996d41d0" gracePeriod=30
Feb 02 13:26:49 crc kubenswrapper[4721]: I0202 13:26:49.438797 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="sg-core" containerID="cri-o://2e794fb911c91ba362224447e9ecd49d098fbbb2f1fb15a45aedc118537561bc" gracePeriod=30
Feb 02 13:26:49 crc kubenswrapper[4721]: I0202 13:26:49.438848 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="ceilometer-notification-agent" containerID="cri-o://3fc0efffdb16822dab9442b0b34ece997fbe26ebe5808aadd9adb66a693a7dd5" gracePeriod=30
Feb 02 13:26:49 crc kubenswrapper[4721]: I0202 13:26:49.447466 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.070987255 podStartE2EDuration="2.447443595s" podCreationTimestamp="2026-02-02 13:26:47 +0000 UTC" firstStartedPulling="2026-02-02 13:26:48.372241245 +0000 UTC m=+1548.674755634" lastFinishedPulling="2026-02-02 13:26:48.748697585 +0000 UTC m=+1549.051211974" observedRunningTime="2026-02-02 13:26:49.437303621 +0000 UTC m=+1549.739818010" watchObservedRunningTime="2026-02-02 13:26:49.447443595 +0000 UTC m=+1549.749957994"
Feb 02 13:26:49 crc kubenswrapper[4721]: I0202 13:26:49.712253 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="43a836c9-b9f7-4991-9cb4-db6dce6f8e08" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.0:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 02 13:26:49 crc kubenswrapper[4721]: I0202 13:26:49.712286 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="43a836c9-b9f7-4991-9cb4-db6dce6f8e08" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.0:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 02 13:26:50 crc kubenswrapper[4721]: I0202 13:26:50.464044 4721 generic.go:334] "Generic (PLEG): container finished" podID="dd964db7-c2d3-477b-be71-60058c811541" containerID="4b856042309c84ea0a6e8c400ccf627a4542f6766de95577e1154ef8996d41d0" exitCode=0
Feb 02 13:26:50 crc kubenswrapper[4721]: I0202 13:26:50.464491 4721 generic.go:334] "Generic (PLEG): container finished" podID="dd964db7-c2d3-477b-be71-60058c811541" containerID="2e794fb911c91ba362224447e9ecd49d098fbbb2f1fb15a45aedc118537561bc" exitCode=2
Feb 02 13:26:50 crc kubenswrapper[4721]: I0202 13:26:50.464507 4721 generic.go:334] "Generic (PLEG): container finished" podID="dd964db7-c2d3-477b-be71-60058c811541" containerID="941cea547a9c6139f4cd34b3cba1a9232469327b41a70627da4554d99e83c28b" exitCode=0
Feb 02 13:26:50 crc kubenswrapper[4721]: I0202 13:26:50.464571 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd964db7-c2d3-477b-be71-60058c811541","Type":"ContainerDied","Data":"4b856042309c84ea0a6e8c400ccf627a4542f6766de95577e1154ef8996d41d0"}
Feb 02 13:26:50 crc kubenswrapper[4721]: I0202 13:26:50.464609 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd964db7-c2d3-477b-be71-60058c811541","Type":"ContainerDied","Data":"2e794fb911c91ba362224447e9ecd49d098fbbb2f1fb15a45aedc118537561bc"}
Feb 02 13:26:50 crc kubenswrapper[4721]: I0202 13:26:50.464625 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd964db7-c2d3-477b-be71-60058c811541","Type":"ContainerDied","Data":"941cea547a9c6139f4cd34b3cba1a9232469327b41a70627da4554d99e83c28b"}
Feb 02 13:26:50 crc kubenswrapper[4721]: I0202 13:26:50.468000 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"8abde028-43c5-4489-8de6-7c2da9f037c2","Type":"ContainerStarted","Data":"5d7454ef92695803ec7743b8396827a94b86a25fb897e83e2a1701b15a668f42"}
Feb 02 13:26:50 crc kubenswrapper[4721]: I0202 13:26:50.495358 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.840500845 podStartE2EDuration="3.495334875s" podCreationTimestamp="2026-02-02 13:26:47 +0000 UTC" firstStartedPulling="2026-02-02 13:26:48.856525949 +0000 UTC m=+1549.159040328" lastFinishedPulling="2026-02-02 13:26:49.511359969 +0000 UTC m=+1549.813874358" observedRunningTime="2026-02-02 13:26:50.486979698 +0000 UTC m=+1550.789494077" watchObservedRunningTime="2026-02-02 13:26:50.495334875 +0000 UTC m=+1550.797849274"
Feb 02 13:26:51 crc kubenswrapper[4721]: E0202 13:26:51.139528 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01b253a_c7c6_4c9e_a800_a1732ba06f37.slice/crio-5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6\": RecentStats: unable to find data in memory cache]"
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.500340 4721 generic.go:334] "Generic (PLEG): container finished" podID="dd964db7-c2d3-477b-be71-60058c811541" containerID="3fc0efffdb16822dab9442b0b34ece997fbe26ebe5808aadd9adb66a693a7dd5" exitCode=0
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.500422 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd964db7-c2d3-477b-be71-60058c811541","Type":"ContainerDied","Data":"3fc0efffdb16822dab9442b0b34ece997fbe26ebe5808aadd9adb66a693a7dd5"}
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.500679 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd964db7-c2d3-477b-be71-60058c811541","Type":"ContainerDied","Data":"72f8e4545197a36201cfdd4ca1128b7b9319c774b1a3e5c0ba44e89faa242906"}
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.500694 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72f8e4545197a36201cfdd4ca1128b7b9319c774b1a3e5c0ba44e89faa242906"
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.569945 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.575461 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.576824 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.579545 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.736106 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd964db7-c2d3-477b-be71-60058c811541-log-httpd\") pod \"dd964db7-c2d3-477b-be71-60058c811541\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") "
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.736201 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlqf6\" (UniqueName: \"kubernetes.io/projected/dd964db7-c2d3-477b-be71-60058c811541-kube-api-access-rlqf6\") pod \"dd964db7-c2d3-477b-be71-60058c811541\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") "
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.736244 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-scripts\") pod \"dd964db7-c2d3-477b-be71-60058c811541\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") "
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.736417 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-sg-core-conf-yaml\") pod \"dd964db7-c2d3-477b-be71-60058c811541\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") "
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.736593 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-config-data\") pod \"dd964db7-c2d3-477b-be71-60058c811541\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") "
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.736649 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd964db7-c2d3-477b-be71-60058c811541-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dd964db7-c2d3-477b-be71-60058c811541" (UID: "dd964db7-c2d3-477b-be71-60058c811541"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.736686 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd964db7-c2d3-477b-be71-60058c811541-run-httpd\") pod \"dd964db7-c2d3-477b-be71-60058c811541\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") "
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.736731 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-combined-ca-bundle\") pod \"dd964db7-c2d3-477b-be71-60058c811541\" (UID: \"dd964db7-c2d3-477b-be71-60058c811541\") "
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.736907 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd964db7-c2d3-477b-be71-60058c811541-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dd964db7-c2d3-477b-be71-60058c811541" (UID: "dd964db7-c2d3-477b-be71-60058c811541"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.737423 4721 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd964db7-c2d3-477b-be71-60058c811541-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.737442 4721 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd964db7-c2d3-477b-be71-60058c811541-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.741826 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-scripts" (OuterVolumeSpecName: "scripts") pod "dd964db7-c2d3-477b-be71-60058c811541" (UID: "dd964db7-c2d3-477b-be71-60058c811541"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.743726 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd964db7-c2d3-477b-be71-60058c811541-kube-api-access-rlqf6" (OuterVolumeSpecName: "kube-api-access-rlqf6") pod "dd964db7-c2d3-477b-be71-60058c811541" (UID: "dd964db7-c2d3-477b-be71-60058c811541"). InnerVolumeSpecName "kube-api-access-rlqf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.779594 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dd964db7-c2d3-477b-be71-60058c811541" (UID: "dd964db7-c2d3-477b-be71-60058c811541"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.839375 4721 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.839437 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlqf6\" (UniqueName: \"kubernetes.io/projected/dd964db7-c2d3-477b-be71-60058c811541-kube-api-access-rlqf6\") on node \"crc\" DevicePath \"\""
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.839452 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.848312 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd964db7-c2d3-477b-be71-60058c811541" (UID: "dd964db7-c2d3-477b-be71-60058c811541"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.861256 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-config-data" (OuterVolumeSpecName: "config-data") pod "dd964db7-c2d3-477b-be71-60058c811541" (UID: "dd964db7-c2d3-477b-be71-60058c811541"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.941831 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 13:26:53 crc kubenswrapper[4721]: I0202 13:26:53.941866 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd964db7-c2d3-477b-be71-60058c811541-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.510987 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.519146 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.537184 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.548485 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.588666 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 02 13:26:54 crc kubenswrapper[4721]: E0202 13:26:54.589573 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="ceilometer-notification-agent"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.589599 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="ceilometer-notification-agent"
Feb 02 13:26:54 crc kubenswrapper[4721]: E0202 13:26:54.589644 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="proxy-httpd"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.589653 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="proxy-httpd"
Feb 02 13:26:54 crc kubenswrapper[4721]: E0202 13:26:54.589682 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="ceilometer-central-agent"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.589692 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="ceilometer-central-agent"
Feb 02 13:26:54 crc kubenswrapper[4721]: E0202 13:26:54.589712 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="sg-core"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.589722 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="sg-core"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.590013 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="proxy-httpd"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.590038 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="sg-core"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.590095 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="ceilometer-notification-agent"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.590109 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd964db7-c2d3-477b-be71-60058c811541" containerName="ceilometer-central-agent"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.592865 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.597144 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.597375 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.597541 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.625723 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.661816 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-scripts\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.662247 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.662508 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.662668 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-config-data\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.662726 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s9dq\" (UniqueName: \"kubernetes.io/projected/aff077fb-9974-49d0-a292-6ec2e865fb66-kube-api-access-4s9dq\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.662767 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aff077fb-9974-49d0-a292-6ec2e865fb66-log-httpd\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.662813 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.662900 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aff077fb-9974-49d0-a292-6ec2e865fb66-run-httpd\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.764952 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-config-data\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.765021 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s9dq\" (UniqueName: \"kubernetes.io/projected/aff077fb-9974-49d0-a292-6ec2e865fb66-kube-api-access-4s9dq\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.765053 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aff077fb-9974-49d0-a292-6ec2e865fb66-log-httpd\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.765103 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.765161 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aff077fb-9974-49d0-a292-6ec2e865fb66-run-httpd\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.765226 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-scripts\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.765362 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.765457 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.767026 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aff077fb-9974-49d0-a292-6ec2e865fb66-log-httpd\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.767472 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aff077fb-9974-49d0-a292-6ec2e865fb66-run-httpd\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.770447 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.771088 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.771601 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.772448 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-config-data\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.772697 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-scripts\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.790411 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s9dq\" (UniqueName: \"kubernetes.io/projected/aff077fb-9974-49d0-a292-6ec2e865fb66-kube-api-access-4s9dq\") pod \"ceilometer-0\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " pod="openstack/ceilometer-0"
Feb 02 13:26:54 crc kubenswrapper[4721]: I0202 13:26:54.918033 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 13:26:55 crc kubenswrapper[4721]: E0202 13:26:55.340130 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01b253a_c7c6_4c9e_a800_a1732ba06f37.slice/crio-5b3069f620076da85927b973fa828028181bff883e33dc0cd16ab04434fa94e6\": RecentStats: unable to find data in memory cache]"
Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.438447 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.509795 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-combined-ca-bundle\") pod \"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e\" (UID: \"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e\") " Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.509838 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-config-data\") pod \"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e\" (UID: \"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e\") " Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.509966 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh8l7\" (UniqueName: \"kubernetes.io/projected/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-kube-api-access-nh8l7\") pod \"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e\" (UID: \"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e\") " Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.525477 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-kube-api-access-nh8l7" (OuterVolumeSpecName: "kube-api-access-nh8l7") pod "20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e" (UID: "20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e"). InnerVolumeSpecName "kube-api-access-nh8l7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.542645 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e" (UID: "20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.544930 4721 generic.go:334] "Generic (PLEG): container finished" podID="20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e" containerID="39bda0040f1cfe72cdc0ea0bc846c40e1727cedfc5ccf95841b81f4bbd9ed35a" exitCode=137 Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.545078 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e","Type":"ContainerDied","Data":"39bda0040f1cfe72cdc0ea0bc846c40e1727cedfc5ccf95841b81f4bbd9ed35a"} Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.545144 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e","Type":"ContainerDied","Data":"b40b44f42b58694b1e7abb93930ae837bfaf27b3b7a9cd3931ed69ef1a81d994"} Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.545167 4721 scope.go:117] "RemoveContainer" containerID="39bda0040f1cfe72cdc0ea0bc846c40e1727cedfc5ccf95841b81f4bbd9ed35a" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.545037 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.557195 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-config-data" (OuterVolumeSpecName: "config-data") pod "20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e" (UID: "20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.613841 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.613877 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.613891 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh8l7\" (UniqueName: \"kubernetes.io/projected/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e-kube-api-access-nh8l7\") on node \"crc\" DevicePath \"\"" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.663244 4721 scope.go:117] "RemoveContainer" containerID="39bda0040f1cfe72cdc0ea0bc846c40e1727cedfc5ccf95841b81f4bbd9ed35a" Feb 02 13:26:55 crc kubenswrapper[4721]: E0202 13:26:55.663702 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39bda0040f1cfe72cdc0ea0bc846c40e1727cedfc5ccf95841b81f4bbd9ed35a\": container with ID starting with 39bda0040f1cfe72cdc0ea0bc846c40e1727cedfc5ccf95841b81f4bbd9ed35a not found: ID does not exist" containerID="39bda0040f1cfe72cdc0ea0bc846c40e1727cedfc5ccf95841b81f4bbd9ed35a" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.663756 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39bda0040f1cfe72cdc0ea0bc846c40e1727cedfc5ccf95841b81f4bbd9ed35a"} err="failed to get container status \"39bda0040f1cfe72cdc0ea0bc846c40e1727cedfc5ccf95841b81f4bbd9ed35a\": rpc error: code = NotFound desc = could not find container \"39bda0040f1cfe72cdc0ea0bc846c40e1727cedfc5ccf95841b81f4bbd9ed35a\": container with ID starting with 39bda0040f1cfe72cdc0ea0bc846c40e1727cedfc5ccf95841b81f4bbd9ed35a not found: ID does not exist" Feb 02 13:26:55 crc kubenswrapper[4721]: W0202 13:26:55.742993 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaff077fb_9974_49d0_a292_6ec2e865fb66.slice/crio-0f5022370d32f82b4c12f27c9216defd2b800b46bb0788cdc6b534feca22926c WatchSource:0}: Error finding container 0f5022370d32f82b4c12f27c9216defd2b800b46bb0788cdc6b534feca22926c: Status 404 returned error can't find the container with id 0f5022370d32f82b4c12f27c9216defd2b800b46bb0788cdc6b534feca22926c Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.749424 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.886474 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.902485 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 
02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.914194 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 13:26:55 crc kubenswrapper[4721]: E0202 13:26:55.915136 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.915171 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.915469 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.917096 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.919406 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.920008 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.921134 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 02 13:26:55 crc kubenswrapper[4721]: I0202 13:26:55.926282 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.021930 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8f3d4c-b262-4b71-a934-f584c1f07790-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab8f3d4c-b262-4b71-a934-f584c1f07790\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.022014 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8f3d4c-b262-4b71-a934-f584c1f07790-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab8f3d4c-b262-4b71-a934-f584c1f07790\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.022112 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8f3d4c-b262-4b71-a934-f584c1f07790-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab8f3d4c-b262-4b71-a934-f584c1f07790\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.022194 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab8f3d4c-b262-4b71-a934-f584c1f07790-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab8f3d4c-b262-4b71-a934-f584c1f07790\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.022252 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2cwt\" (UniqueName: 
\"kubernetes.io/projected/ab8f3d4c-b262-4b71-a934-f584c1f07790-kube-api-access-s2cwt\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab8f3d4c-b262-4b71-a934-f584c1f07790\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.124279 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8f3d4c-b262-4b71-a934-f584c1f07790-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab8f3d4c-b262-4b71-a934-f584c1f07790\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.124373 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8f3d4c-b262-4b71-a934-f584c1f07790-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab8f3d4c-b262-4b71-a934-f584c1f07790\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.124466 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab8f3d4c-b262-4b71-a934-f584c1f07790-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab8f3d4c-b262-4b71-a934-f584c1f07790\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.124528 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2cwt\" (UniqueName: \"kubernetes.io/projected/ab8f3d4c-b262-4b71-a934-f584c1f07790-kube-api-access-s2cwt\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab8f3d4c-b262-4b71-a934-f584c1f07790\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.124583 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8f3d4c-b262-4b71-a934-f584c1f07790-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab8f3d4c-b262-4b71-a934-f584c1f07790\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.133032 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8f3d4c-b262-4b71-a934-f584c1f07790-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab8f3d4c-b262-4b71-a934-f584c1f07790\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.133059 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8f3d4c-b262-4b71-a934-f584c1f07790-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab8f3d4c-b262-4b71-a934-f584c1f07790\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.133126 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab8f3d4c-b262-4b71-a934-f584c1f07790-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab8f3d4c-b262-4b71-a934-f584c1f07790\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.133548 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab8f3d4c-b262-4b71-a934-f584c1f07790-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"ab8f3d4c-b262-4b71-a934-f584c1f07790\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.142657 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2cwt\" (UniqueName: \"kubernetes.io/projected/ab8f3d4c-b262-4b71-a934-f584c1f07790-kube-api-access-s2cwt\") pod \"nova-cell1-novncproxy-0\" (UID: \"ab8f3d4c-b262-4b71-a934-f584c1f07790\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.241251 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.428321 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e" path="/var/lib/kubelet/pods/20108eee-c30d-4d6b-b3d1-bbebfe4e9a0e/volumes" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.430715 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd964db7-c2d3-477b-be71-60058c811541" path="/var/lib/kubelet/pods/dd964db7-c2d3-477b-be71-60058c811541/volumes" Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.561715 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aff077fb-9974-49d0-a292-6ec2e865fb66","Type":"ContainerStarted","Data":"0f5022370d32f82b4c12f27c9216defd2b800b46bb0788cdc6b534feca22926c"} Feb 02 13:26:56 crc kubenswrapper[4721]: I0202 13:26:56.766002 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 13:26:56 crc kubenswrapper[4721]: W0202 13:26:56.768814 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab8f3d4c_b262_4b71_a934_f584c1f07790.slice/crio-cdfe31faf5b0bb74e1240fa508841b91b1fcd38c1013c3456b6f88b75cf2e3c3 WatchSource:0}: Error finding container cdfe31faf5b0bb74e1240fa508841b91b1fcd38c1013c3456b6f88b75cf2e3c3: Status 404 returned error can't find the container with id cdfe31faf5b0bb74e1240fa508841b91b1fcd38c1013c3456b6f88b75cf2e3c3 Feb 02 13:26:57 crc kubenswrapper[4721]: I0202 13:26:57.576029 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aff077fb-9974-49d0-a292-6ec2e865fb66","Type":"ContainerStarted","Data":"21ff204c5ac14a91dcf68f01d3b2281bff194d3d4953fe8e4f71719ba097f0db"} Feb 02 13:26:57 crc kubenswrapper[4721]: I0202 13:26:57.576400 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aff077fb-9974-49d0-a292-6ec2e865fb66","Type":"ContainerStarted","Data":"7c6cd5845fd6326a60d04161ca6495067b38e25fced7ab0983a3a262f70554d8"} Feb 02 13:26:57 crc kubenswrapper[4721]: I0202 13:26:57.578755 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ab8f3d4c-b262-4b71-a934-f584c1f07790","Type":"ContainerStarted","Data":"f898c6d26001e92c4ce65251bab93bd8ae9829ada56e10c907665d13d9cac82c"} Feb 02 13:26:57 crc kubenswrapper[4721]: I0202 13:26:57.578796 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ab8f3d4c-b262-4b71-a934-f584c1f07790","Type":"ContainerStarted","Data":"cdfe31faf5b0bb74e1240fa508841b91b1fcd38c1013c3456b6f88b75cf2e3c3"} Feb 02 13:26:57 crc kubenswrapper[4721]: I0202 13:26:57.603731 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" 
podStartSLOduration=2.603710141 podStartE2EDuration="2.603710141s" podCreationTimestamp="2026-02-02 13:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:26:57.595513169 +0000 UTC m=+1557.898027558" watchObservedRunningTime="2026-02-02 13:26:57.603710141 +0000 UTC m=+1557.906224530" Feb 02 13:26:57 crc kubenswrapper[4721]: I0202 13:26:57.734745 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 02 13:26:58 crc kubenswrapper[4721]: I0202 13:26:58.637010 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aff077fb-9974-49d0-a292-6ec2e865fb66","Type":"ContainerStarted","Data":"948f6018527aef7a4550fc8c54317e9398a8dbb4dac55b06acf1eae76dacca28"} Feb 02 13:26:58 crc kubenswrapper[4721]: I0202 13:26:58.642589 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 13:26:58 crc kubenswrapper[4721]: I0202 13:26:58.642955 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 13:26:58 crc kubenswrapper[4721]: I0202 13:26:58.646371 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 13:26:58 crc kubenswrapper[4721]: I0202 13:26:58.647600 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 02 13:26:59 crc kubenswrapper[4721]: I0202 13:26:59.647856 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 13:26:59 crc kubenswrapper[4721]: I0202 13:26:59.651241 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 02 13:26:59 crc kubenswrapper[4721]: I0202 13:26:59.845855 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5"] Feb 02 13:26:59 crc kubenswrapper[4721]: I0202 13:26:59.848489 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:26:59 crc kubenswrapper[4721]: I0202 13:26:59.854721 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5"] Feb 02 13:26:59 crc kubenswrapper[4721]: I0202 13:26:59.917990 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:26:59 crc kubenswrapper[4721]: I0202 13:26:59.918131 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:26:59 crc kubenswrapper[4721]: I0202 13:26:59.918166 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-config\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:26:59 crc kubenswrapper[4721]: I0202 13:26:59.918193 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:26:59 crc kubenswrapper[4721]: I0202 13:26:59.918220 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65lcl\" (UniqueName: \"kubernetes.io/projected/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-kube-api-access-65lcl\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:26:59 crc kubenswrapper[4721]: I0202 13:26:59.918314 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:27:00 crc kubenswrapper[4721]: I0202 13:27:00.020491 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:27:00 crc kubenswrapper[4721]: I0202 13:27:00.020570 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:27:00 crc kubenswrapper[4721]: I0202 13:27:00.020668 4721 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:27:00 crc kubenswrapper[4721]: I0202 13:27:00.020703 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-config\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:27:00 crc kubenswrapper[4721]: I0202 13:27:00.020729 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:27:00 crc kubenswrapper[4721]: I0202 13:27:00.020762 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65lcl\" (UniqueName: \"kubernetes.io/projected/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-kube-api-access-65lcl\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:27:00 crc kubenswrapper[4721]: I0202 13:27:00.021594 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:27:00 crc kubenswrapper[4721]: I0202 13:27:00.021649 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:27:00 crc kubenswrapper[4721]: I0202 13:27:00.021728 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:27:00 crc kubenswrapper[4721]: I0202 13:27:00.021728 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:27:00 crc kubenswrapper[4721]: I0202 13:27:00.022412 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-config\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:27:00 crc kubenswrapper[4721]: I0202 13:27:00.039797 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65lcl\" (UniqueName: 
\"kubernetes.io/projected/1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b-kube-api-access-65lcl\") pod \"dnsmasq-dns-6b7bbf7cf9-fz8n5\" (UID: \"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:27:00 crc kubenswrapper[4721]: I0202 13:27:00.184405 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:27:00 crc kubenswrapper[4721]: I0202 13:27:00.804264 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5"] Feb 02 13:27:01 crc kubenswrapper[4721]: I0202 13:27:01.242332 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:27:01 crc kubenswrapper[4721]: I0202 13:27:01.700035 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aff077fb-9974-49d0-a292-6ec2e865fb66","Type":"ContainerStarted","Data":"841b1d028b76ecebf6456a6190bf54216a9623cac4339240f3c8ea1ce8bad857"} Feb 02 13:27:01 crc kubenswrapper[4721]: I0202 13:27:01.700500 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 13:27:01 crc kubenswrapper[4721]: I0202 13:27:01.702870 4721 generic.go:334] "Generic (PLEG): container finished" podID="1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b" containerID="c693a0278e26b7b8778cdf577503d9b8dcaf5fec288f6574f9f36e14a5f3bf1a" exitCode=0 Feb 02 13:27:01 crc kubenswrapper[4721]: I0202 13:27:01.703710 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" event={"ID":"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b","Type":"ContainerDied","Data":"c693a0278e26b7b8778cdf577503d9b8dcaf5fec288f6574f9f36e14a5f3bf1a"} Feb 02 13:27:01 crc kubenswrapper[4721]: I0202 13:27:01.703737 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" event={"ID":"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b","Type":"ContainerStarted","Data":"46b041a1c8868b769f0e6d94df9476e5e601bc8f9599ee2e011a89fba3ef680f"} Feb 02 13:27:01 crc kubenswrapper[4721]: I0202 13:27:01.742296 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.9654303 podStartE2EDuration="7.742273723s" podCreationTimestamp="2026-02-02 13:26:54 +0000 UTC" firstStartedPulling="2026-02-02 13:26:55.745443493 +0000 UTC m=+1556.047957902" lastFinishedPulling="2026-02-02 13:27:00.522286926 +0000 UTC m=+1560.824801325" observedRunningTime="2026-02-02 13:27:01.736738703 +0000 UTC m=+1562.039253102" watchObservedRunningTime="2026-02-02 13:27:01.742273723 +0000 UTC m=+1562.044788102" Feb 02 13:27:02 crc kubenswrapper[4721]: I0202 13:27:02.517434 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 13:27:02 crc kubenswrapper[4721]: I0202 13:27:02.717095 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" event={"ID":"1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b","Type":"ContainerStarted","Data":"dbc6deced1d62c44c9d8f674e3578530eef49c5eab475dff801a9bd1c8e471f4"} Feb 02 13:27:02 crc kubenswrapper[4721]: I0202 13:27:02.718273 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="43a836c9-b9f7-4991-9cb4-db6dce6f8e08" containerName="nova-api-log" containerID="cri-o://59c81dc2121b1526d3fc9cd11a40475f9cc813f682849fa4432ebb8a77fa2a04" gracePeriod=30 Feb 02 13:27:02 crc kubenswrapper[4721]: I0202 
13:27:02.718426 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="43a836c9-b9f7-4991-9cb4-db6dce6f8e08" containerName="nova-api-api" containerID="cri-o://cb14a013f6e5ea8bc11bf1264095a5d3c6d4ad516551071c013c66ca9a01bd5e" gracePeriod=30 Feb 02 13:27:02 crc kubenswrapper[4721]: I0202 13:27:02.719365 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" Feb 02 13:27:02 crc kubenswrapper[4721]: I0202 13:27:02.750119 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5" podStartSLOduration=3.750098437 podStartE2EDuration="3.750098437s" podCreationTimestamp="2026-02-02 13:26:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:27:02.746671283 +0000 UTC m=+1563.049185682" watchObservedRunningTime="2026-02-02 13:27:02.750098437 +0000 UTC m=+1563.052612826" Feb 02 13:27:03 crc kubenswrapper[4721]: I0202 13:27:03.754652 4721 generic.go:334] "Generic (PLEG): container finished" podID="43a836c9-b9f7-4991-9cb4-db6dce6f8e08" containerID="59c81dc2121b1526d3fc9cd11a40475f9cc813f682849fa4432ebb8a77fa2a04" exitCode=143 Feb 02 13:27:03 crc kubenswrapper[4721]: I0202 13:27:03.754697 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43a836c9-b9f7-4991-9cb4-db6dce6f8e08","Type":"ContainerDied","Data":"59c81dc2121b1526d3fc9cd11a40475f9cc813f682849fa4432ebb8a77fa2a04"} Feb 02 13:27:03 crc kubenswrapper[4721]: I0202 13:27:03.885222 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:27:03 crc kubenswrapper[4721]: I0202 13:27:03.886679 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="ceilometer-central-agent" containerID="cri-o://7c6cd5845fd6326a60d04161ca6495067b38e25fced7ab0983a3a262f70554d8" gracePeriod=30 Feb 02 13:27:03 crc kubenswrapper[4721]: I0202 13:27:03.886907 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="proxy-httpd" containerID="cri-o://841b1d028b76ecebf6456a6190bf54216a9623cac4339240f3c8ea1ce8bad857" gracePeriod=30 Feb 02 13:27:03 crc kubenswrapper[4721]: I0202 13:27:03.887101 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="sg-core" containerID="cri-o://948f6018527aef7a4550fc8c54317e9398a8dbb4dac55b06acf1eae76dacca28" gracePeriod=30 Feb 02 13:27:03 crc kubenswrapper[4721]: I0202 13:27:03.887132 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="ceilometer-notification-agent" containerID="cri-o://21ff204c5ac14a91dcf68f01d3b2281bff194d3d4953fe8e4f71719ba097f0db" gracePeriod=30 Feb 02 13:27:04 crc kubenswrapper[4721]: I0202 13:27:04.768953 4721 generic.go:334] "Generic (PLEG): container finished" podID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerID="841b1d028b76ecebf6456a6190bf54216a9623cac4339240f3c8ea1ce8bad857" exitCode=0 Feb 02 13:27:04 crc kubenswrapper[4721]: I0202 13:27:04.769304 4721 generic.go:334] "Generic (PLEG): container finished" 
podID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerID="948f6018527aef7a4550fc8c54317e9398a8dbb4dac55b06acf1eae76dacca28" exitCode=2 Feb 02 13:27:04 crc kubenswrapper[4721]: I0202 13:27:04.769318 4721 generic.go:334] "Generic (PLEG): container finished" podID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerID="21ff204c5ac14a91dcf68f01d3b2281bff194d3d4953fe8e4f71719ba097f0db" exitCode=0 Feb 02 13:27:04 crc kubenswrapper[4721]: I0202 13:27:04.770544 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aff077fb-9974-49d0-a292-6ec2e865fb66","Type":"ContainerDied","Data":"841b1d028b76ecebf6456a6190bf54216a9623cac4339240f3c8ea1ce8bad857"} Feb 02 13:27:04 crc kubenswrapper[4721]: I0202 13:27:04.770579 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aff077fb-9974-49d0-a292-6ec2e865fb66","Type":"ContainerDied","Data":"948f6018527aef7a4550fc8c54317e9398a8dbb4dac55b06acf1eae76dacca28"} Feb 02 13:27:04 crc kubenswrapper[4721]: I0202 13:27:04.770594 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aff077fb-9974-49d0-a292-6ec2e865fb66","Type":"ContainerDied","Data":"21ff204c5ac14a91dcf68f01d3b2281bff194d3d4953fe8e4f71719ba097f0db"} Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.518146 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.643671 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-scripts\") pod \"aff077fb-9974-49d0-a292-6ec2e865fb66\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.643761 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-sg-core-conf-yaml\") pod \"aff077fb-9974-49d0-a292-6ec2e865fb66\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.643881 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-ceilometer-tls-certs\") pod \"aff077fb-9974-49d0-a292-6ec2e865fb66\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.643936 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aff077fb-9974-49d0-a292-6ec2e865fb66-log-httpd\") pod \"aff077fb-9974-49d0-a292-6ec2e865fb66\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.643991 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aff077fb-9974-49d0-a292-6ec2e865fb66-run-httpd\") pod \"aff077fb-9974-49d0-a292-6ec2e865fb66\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.644036 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-config-data\") pod \"aff077fb-9974-49d0-a292-6ec2e865fb66\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " Feb 02 
13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.644631 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-combined-ca-bundle\") pod \"aff077fb-9974-49d0-a292-6ec2e865fb66\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.644544 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aff077fb-9974-49d0-a292-6ec2e865fb66-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aff077fb-9974-49d0-a292-6ec2e865fb66" (UID: "aff077fb-9974-49d0-a292-6ec2e865fb66"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.644968 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s9dq\" (UniqueName: \"kubernetes.io/projected/aff077fb-9974-49d0-a292-6ec2e865fb66-kube-api-access-4s9dq\") pod \"aff077fb-9974-49d0-a292-6ec2e865fb66\" (UID: \"aff077fb-9974-49d0-a292-6ec2e865fb66\") " Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.644560 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aff077fb-9974-49d0-a292-6ec2e865fb66-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aff077fb-9974-49d0-a292-6ec2e865fb66" (UID: "aff077fb-9974-49d0-a292-6ec2e865fb66"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.646002 4721 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aff077fb-9974-49d0-a292-6ec2e865fb66-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.646024 4721 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aff077fb-9974-49d0-a292-6ec2e865fb66-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.649767 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aff077fb-9974-49d0-a292-6ec2e865fb66-kube-api-access-4s9dq" (OuterVolumeSpecName: "kube-api-access-4s9dq") pod "aff077fb-9974-49d0-a292-6ec2e865fb66" (UID: "aff077fb-9974-49d0-a292-6ec2e865fb66"). InnerVolumeSpecName "kube-api-access-4s9dq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.649954 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-scripts" (OuterVolumeSpecName: "scripts") pod "aff077fb-9974-49d0-a292-6ec2e865fb66" (UID: "aff077fb-9974-49d0-a292-6ec2e865fb66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.680615 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aff077fb-9974-49d0-a292-6ec2e865fb66" (UID: "aff077fb-9974-49d0-a292-6ec2e865fb66"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.714335 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "aff077fb-9974-49d0-a292-6ec2e865fb66" (UID: "aff077fb-9974-49d0-a292-6ec2e865fb66"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.748733 4721 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.748774 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s9dq\" (UniqueName: \"kubernetes.io/projected/aff077fb-9974-49d0-a292-6ec2e865fb66-kube-api-access-4s9dq\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.748787 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.748798 4721 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.752299 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aff077fb-9974-49d0-a292-6ec2e865fb66" (UID: "aff077fb-9974-49d0-a292-6ec2e865fb66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.768699 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-config-data" (OuterVolumeSpecName: "config-data") pod "aff077fb-9974-49d0-a292-6ec2e865fb66" (UID: "aff077fb-9974-49d0-a292-6ec2e865fb66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.782607 4721 generic.go:334] "Generic (PLEG): container finished" podID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerID="7c6cd5845fd6326a60d04161ca6495067b38e25fced7ab0983a3a262f70554d8" exitCode=0 Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.782658 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aff077fb-9974-49d0-a292-6ec2e865fb66","Type":"ContainerDied","Data":"7c6cd5845fd6326a60d04161ca6495067b38e25fced7ab0983a3a262f70554d8"} Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.782668 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.782697 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aff077fb-9974-49d0-a292-6ec2e865fb66","Type":"ContainerDied","Data":"0f5022370d32f82b4c12f27c9216defd2b800b46bb0788cdc6b534feca22926c"} Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.782718 4721 scope.go:117] "RemoveContainer" containerID="841b1d028b76ecebf6456a6190bf54216a9623cac4339240f3c8ea1ce8bad857" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.824703 4721 scope.go:117] "RemoveContainer" containerID="948f6018527aef7a4550fc8c54317e9398a8dbb4dac55b06acf1eae76dacca28" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.851818 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.851857 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aff077fb-9974-49d0-a292-6ec2e865fb66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.851886 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.855796 4721 scope.go:117] "RemoveContainer" containerID="21ff204c5ac14a91dcf68f01d3b2281bff194d3d4953fe8e4f71719ba097f0db" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.871101 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.884027 4721 scope.go:117] "RemoveContainer" containerID="7c6cd5845fd6326a60d04161ca6495067b38e25fced7ab0983a3a262f70554d8" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.884613 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:27:05 crc kubenswrapper[4721]: E0202 13:27:05.885361 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="ceilometer-notification-agent" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.885385 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="ceilometer-notification-agent" Feb 02 13:27:05 crc kubenswrapper[4721]: E0202 13:27:05.885416 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="ceilometer-central-agent" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.885427 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="ceilometer-central-agent" Feb 02 13:27:05 crc kubenswrapper[4721]: E0202 13:27:05.885451 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="sg-core" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.885461 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="sg-core" Feb 02 13:27:05 crc kubenswrapper[4721]: E0202 13:27:05.885483 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="proxy-httpd" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.885491 4721 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="proxy-httpd" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.885701 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="ceilometer-notification-agent" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.885733 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="sg-core" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.885746 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="ceilometer-central-agent" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.885965 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" containerName="proxy-httpd" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.888244 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.890196 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.890472 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.890590 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.900600 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.915630 4721 scope.go:117] "RemoveContainer" containerID="841b1d028b76ecebf6456a6190bf54216a9623cac4339240f3c8ea1ce8bad857" Feb 02 13:27:05 crc kubenswrapper[4721]: E0202 13:27:05.916056 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"841b1d028b76ecebf6456a6190bf54216a9623cac4339240f3c8ea1ce8bad857\": container with ID starting with 841b1d028b76ecebf6456a6190bf54216a9623cac4339240f3c8ea1ce8bad857 not found: ID does not exist" containerID="841b1d028b76ecebf6456a6190bf54216a9623cac4339240f3c8ea1ce8bad857" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.916100 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"841b1d028b76ecebf6456a6190bf54216a9623cac4339240f3c8ea1ce8bad857"} err="failed to get container status \"841b1d028b76ecebf6456a6190bf54216a9623cac4339240f3c8ea1ce8bad857\": rpc error: code = NotFound desc = could not find container \"841b1d028b76ecebf6456a6190bf54216a9623cac4339240f3c8ea1ce8bad857\": container with ID starting with 841b1d028b76ecebf6456a6190bf54216a9623cac4339240f3c8ea1ce8bad857 not found: ID does not exist" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.916120 4721 scope.go:117] "RemoveContainer" containerID="948f6018527aef7a4550fc8c54317e9398a8dbb4dac55b06acf1eae76dacca28" Feb 02 13:27:05 crc kubenswrapper[4721]: E0202 13:27:05.916432 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948f6018527aef7a4550fc8c54317e9398a8dbb4dac55b06acf1eae76dacca28\": container with ID starting with 948f6018527aef7a4550fc8c54317e9398a8dbb4dac55b06acf1eae76dacca28 not found: ID does not exist" 
containerID="948f6018527aef7a4550fc8c54317e9398a8dbb4dac55b06acf1eae76dacca28" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.916564 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948f6018527aef7a4550fc8c54317e9398a8dbb4dac55b06acf1eae76dacca28"} err="failed to get container status \"948f6018527aef7a4550fc8c54317e9398a8dbb4dac55b06acf1eae76dacca28\": rpc error: code = NotFound desc = could not find container \"948f6018527aef7a4550fc8c54317e9398a8dbb4dac55b06acf1eae76dacca28\": container with ID starting with 948f6018527aef7a4550fc8c54317e9398a8dbb4dac55b06acf1eae76dacca28 not found: ID does not exist" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.916579 4721 scope.go:117] "RemoveContainer" containerID="21ff204c5ac14a91dcf68f01d3b2281bff194d3d4953fe8e4f71719ba097f0db" Feb 02 13:27:05 crc kubenswrapper[4721]: E0202 13:27:05.916898 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21ff204c5ac14a91dcf68f01d3b2281bff194d3d4953fe8e4f71719ba097f0db\": container with ID starting with 21ff204c5ac14a91dcf68f01d3b2281bff194d3d4953fe8e4f71719ba097f0db not found: ID does not exist" containerID="21ff204c5ac14a91dcf68f01d3b2281bff194d3d4953fe8e4f71719ba097f0db" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.916919 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21ff204c5ac14a91dcf68f01d3b2281bff194d3d4953fe8e4f71719ba097f0db"} err="failed to get container status \"21ff204c5ac14a91dcf68f01d3b2281bff194d3d4953fe8e4f71719ba097f0db\": rpc error: code = NotFound desc = could not find container \"21ff204c5ac14a91dcf68f01d3b2281bff194d3d4953fe8e4f71719ba097f0db\": container with ID starting with 21ff204c5ac14a91dcf68f01d3b2281bff194d3d4953fe8e4f71719ba097f0db not found: ID does not exist" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.916931 4721 scope.go:117] "RemoveContainer" containerID="7c6cd5845fd6326a60d04161ca6495067b38e25fced7ab0983a3a262f70554d8" Feb 02 13:27:05 crc kubenswrapper[4721]: E0202 13:27:05.917237 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c6cd5845fd6326a60d04161ca6495067b38e25fced7ab0983a3a262f70554d8\": container with ID starting with 7c6cd5845fd6326a60d04161ca6495067b38e25fced7ab0983a3a262f70554d8 not found: ID does not exist" containerID="7c6cd5845fd6326a60d04161ca6495067b38e25fced7ab0983a3a262f70554d8" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.917259 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c6cd5845fd6326a60d04161ca6495067b38e25fced7ab0983a3a262f70554d8"} err="failed to get container status \"7c6cd5845fd6326a60d04161ca6495067b38e25fced7ab0983a3a262f70554d8\": rpc error: code = NotFound desc = could not find container \"7c6cd5845fd6326a60d04161ca6495067b38e25fced7ab0983a3a262f70554d8\": container with ID starting with 7c6cd5845fd6326a60d04161ca6495067b38e25fced7ab0983a3a262f70554d8 not found: ID does not exist" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.953425 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c678f02b-cbee-4578-9e28-067b63af2682-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:05 crc 
kubenswrapper[4721]: I0202 13:27:05.953517 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c678f02b-cbee-4578-9e28-067b63af2682-log-httpd\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.953581 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxsfc\" (UniqueName: \"kubernetes.io/projected/c678f02b-cbee-4578-9e28-067b63af2682-kube-api-access-fxsfc\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.953667 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c678f02b-cbee-4578-9e28-067b63af2682-scripts\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.953711 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c678f02b-cbee-4578-9e28-067b63af2682-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.953746 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c678f02b-cbee-4578-9e28-067b63af2682-run-httpd\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.953778 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c678f02b-cbee-4578-9e28-067b63af2682-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:05 crc kubenswrapper[4721]: I0202 13:27:05.953916 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c678f02b-cbee-4578-9e28-067b63af2682-config-data\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.055973 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c678f02b-cbee-4578-9e28-067b63af2682-scripts\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.056041 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c678f02b-cbee-4578-9e28-067b63af2682-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.056088 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c678f02b-cbee-4578-9e28-067b63af2682-run-httpd\") pod 
\"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.056110 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c678f02b-cbee-4578-9e28-067b63af2682-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.056140 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c678f02b-cbee-4578-9e28-067b63af2682-config-data\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.056219 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c678f02b-cbee-4578-9e28-067b63af2682-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.056907 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c678f02b-cbee-4578-9e28-067b63af2682-log-httpd\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.056976 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxsfc\" (UniqueName: \"kubernetes.io/projected/c678f02b-cbee-4578-9e28-067b63af2682-kube-api-access-fxsfc\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.057208 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c678f02b-cbee-4578-9e28-067b63af2682-run-httpd\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.057316 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c678f02b-cbee-4578-9e28-067b63af2682-log-httpd\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.060461 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c678f02b-cbee-4578-9e28-067b63af2682-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.060546 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c678f02b-cbee-4578-9e28-067b63af2682-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.061735 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c678f02b-cbee-4578-9e28-067b63af2682-scripts\") pod \"ceilometer-0\" (UID: 
\"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.062010 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c678f02b-cbee-4578-9e28-067b63af2682-config-data\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.062880 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c678f02b-cbee-4578-9e28-067b63af2682-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.078912 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxsfc\" (UniqueName: \"kubernetes.io/projected/c678f02b-cbee-4578-9e28-067b63af2682-kube-api-access-fxsfc\") pod \"ceilometer-0\" (UID: \"c678f02b-cbee-4578-9e28-067b63af2682\") " pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.205789 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.242011 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.278653 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.422971 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aff077fb-9974-49d0-a292-6ec2e865fb66" path="/var/lib/kubelet/pods/aff077fb-9974-49d0-a292-6ec2e865fb66/volumes" Feb 02 13:27:06 crc kubenswrapper[4721]: W0202 13:27:06.661662 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc678f02b_cbee_4578_9e28_067b63af2682.slice/crio-196c1ad24ffe9fbb8c6037d1f0687a36a8874420cbb3c304dbf4cdf92e555e7b WatchSource:0}: Error finding container 196c1ad24ffe9fbb8c6037d1f0687a36a8874420cbb3c304dbf4cdf92e555e7b: Status 404 returned error can't find the container with id 196c1ad24ffe9fbb8c6037d1f0687a36a8874420cbb3c304dbf4cdf92e555e7b Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.664275 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.773213 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.813218 4721 generic.go:334] "Generic (PLEG): container finished" podID="43a836c9-b9f7-4991-9cb4-db6dce6f8e08" containerID="cb14a013f6e5ea8bc11bf1264095a5d3c6d4ad516551071c013c66ca9a01bd5e" exitCode=0 Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.813283 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43a836c9-b9f7-4991-9cb4-db6dce6f8e08","Type":"ContainerDied","Data":"cb14a013f6e5ea8bc11bf1264095a5d3c6d4ad516551071c013c66ca9a01bd5e"} Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.813308 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43a836c9-b9f7-4991-9cb4-db6dce6f8e08","Type":"ContainerDied","Data":"4fca7f0bef96bd658599c1f0a2e8aa492cdcf6a60b42a42e88a85a2eec4a0be9"} Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.813325 4721 scope.go:117] "RemoveContainer" containerID="cb14a013f6e5ea8bc11bf1264095a5d3c6d4ad516551071c013c66ca9a01bd5e" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.813431 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.826591 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c678f02b-cbee-4578-9e28-067b63af2682","Type":"ContainerStarted","Data":"196c1ad24ffe9fbb8c6037d1f0687a36a8874420cbb3c304dbf4cdf92e555e7b"} Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.863005 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.876115 4721 scope.go:117] "RemoveContainer" containerID="59c81dc2121b1526d3fc9cd11a40475f9cc813f682849fa4432ebb8a77fa2a04" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.880080 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-logs\") pod \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.880367 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25b4g\" (UniqueName: \"kubernetes.io/projected/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-kube-api-access-25b4g\") pod \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.884867 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-combined-ca-bundle\") pod \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.884926 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-config-data\") pod \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\" (UID: \"43a836c9-b9f7-4991-9cb4-db6dce6f8e08\") " Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.887322 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-logs" (OuterVolumeSpecName: "logs") pod 
"43a836c9-b9f7-4991-9cb4-db6dce6f8e08" (UID: "43a836c9-b9f7-4991-9cb4-db6dce6f8e08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.902819 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-kube-api-access-25b4g" (OuterVolumeSpecName: "kube-api-access-25b4g") pod "43a836c9-b9f7-4991-9cb4-db6dce6f8e08" (UID: "43a836c9-b9f7-4991-9cb4-db6dce6f8e08"). InnerVolumeSpecName "kube-api-access-25b4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.919432 4721 scope.go:117] "RemoveContainer" containerID="cb14a013f6e5ea8bc11bf1264095a5d3c6d4ad516551071c013c66ca9a01bd5e" Feb 02 13:27:06 crc kubenswrapper[4721]: E0202 13:27:06.920839 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb14a013f6e5ea8bc11bf1264095a5d3c6d4ad516551071c013c66ca9a01bd5e\": container with ID starting with cb14a013f6e5ea8bc11bf1264095a5d3c6d4ad516551071c013c66ca9a01bd5e not found: ID does not exist" containerID="cb14a013f6e5ea8bc11bf1264095a5d3c6d4ad516551071c013c66ca9a01bd5e" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.920878 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb14a013f6e5ea8bc11bf1264095a5d3c6d4ad516551071c013c66ca9a01bd5e"} err="failed to get container status \"cb14a013f6e5ea8bc11bf1264095a5d3c6d4ad516551071c013c66ca9a01bd5e\": rpc error: code = NotFound desc = could not find container \"cb14a013f6e5ea8bc11bf1264095a5d3c6d4ad516551071c013c66ca9a01bd5e\": container with ID starting with cb14a013f6e5ea8bc11bf1264095a5d3c6d4ad516551071c013c66ca9a01bd5e not found: ID does not exist" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.920907 4721 scope.go:117] "RemoveContainer" containerID="59c81dc2121b1526d3fc9cd11a40475f9cc813f682849fa4432ebb8a77fa2a04" Feb 02 13:27:06 crc kubenswrapper[4721]: E0202 13:27:06.921966 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59c81dc2121b1526d3fc9cd11a40475f9cc813f682849fa4432ebb8a77fa2a04\": container with ID starting with 59c81dc2121b1526d3fc9cd11a40475f9cc813f682849fa4432ebb8a77fa2a04 not found: ID does not exist" containerID="59c81dc2121b1526d3fc9cd11a40475f9cc813f682849fa4432ebb8a77fa2a04" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.922007 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c81dc2121b1526d3fc9cd11a40475f9cc813f682849fa4432ebb8a77fa2a04"} err="failed to get container status \"59c81dc2121b1526d3fc9cd11a40475f9cc813f682849fa4432ebb8a77fa2a04\": rpc error: code = NotFound desc = could not find container \"59c81dc2121b1526d3fc9cd11a40475f9cc813f682849fa4432ebb8a77fa2a04\": container with ID starting with 59c81dc2121b1526d3fc9cd11a40475f9cc813f682849fa4432ebb8a77fa2a04 not found: ID does not exist" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.946257 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43a836c9-b9f7-4991-9cb4-db6dce6f8e08" (UID: "43a836c9-b9f7-4991-9cb4-db6dce6f8e08"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.951308 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-config-data" (OuterVolumeSpecName: "config-data") pod "43a836c9-b9f7-4991-9cb4-db6dce6f8e08" (UID: "43a836c9-b9f7-4991-9cb4-db6dce6f8e08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.991445 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.991650 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.991740 4721 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-logs\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:06 crc kubenswrapper[4721]: I0202 13:27:06.991807 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25b4g\" (UniqueName: \"kubernetes.io/projected/43a836c9-b9f7-4991-9cb4-db6dce6f8e08-kube-api-access-25b4g\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.024944 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-crdjt"] Feb 02 13:27:07 crc kubenswrapper[4721]: E0202 13:27:07.025624 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a836c9-b9f7-4991-9cb4-db6dce6f8e08" containerName="nova-api-api" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.025648 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a836c9-b9f7-4991-9cb4-db6dce6f8e08" containerName="nova-api-api" Feb 02 13:27:07 crc kubenswrapper[4721]: E0202 13:27:07.025670 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a836c9-b9f7-4991-9cb4-db6dce6f8e08" containerName="nova-api-log" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.025681 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a836c9-b9f7-4991-9cb4-db6dce6f8e08" containerName="nova-api-log" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.025978 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a836c9-b9f7-4991-9cb4-db6dce6f8e08" containerName="nova-api-api" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.026004 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a836c9-b9f7-4991-9cb4-db6dce6f8e08" containerName="nova-api-log" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.027139 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.030621 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.030846 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.055503 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-crdjt"] Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.094379 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-crdjt\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.094668 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-scripts\") pod \"nova-cell1-cell-mapping-crdjt\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.094779 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6svz\" (UniqueName: \"kubernetes.io/projected/27df0911-fe79-4339-a6fe-cf538f97a247-kube-api-access-m6svz\") pod \"nova-cell1-cell-mapping-crdjt\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.094909 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-config-data\") pod \"nova-cell1-cell-mapping-crdjt\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.197847 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-crdjt\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.198057 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-scripts\") pod \"nova-cell1-cell-mapping-crdjt\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.198140 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6svz\" (UniqueName: \"kubernetes.io/projected/27df0911-fe79-4339-a6fe-cf538f97a247-kube-api-access-m6svz\") pod \"nova-cell1-cell-mapping-crdjt\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.198229 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-config-data\") pod \"nova-cell1-cell-mapping-crdjt\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.203198 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-config-data\") pod \"nova-cell1-cell-mapping-crdjt\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.203363 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-scripts\") pod \"nova-cell1-cell-mapping-crdjt\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.219989 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-crdjt\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.222349 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6svz\" (UniqueName: \"kubernetes.io/projected/27df0911-fe79-4339-a6fe-cf538f97a247-kube-api-access-m6svz\") pod \"nova-cell1-cell-mapping-crdjt\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.338989 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.364179 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.382833 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.385698 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.391743 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.392028 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.392308 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.401851 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.433172 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.508341 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.508413 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rww68\" (UniqueName: \"kubernetes.io/projected/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-kube-api-access-rww68\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.508523 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.508618 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-config-data\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.508818 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-public-tls-certs\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.508893 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-logs\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.612817 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-config-data\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.613202 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-public-tls-certs\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.613237 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-logs\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.613417 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.613441 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rww68\" (UniqueName: \"kubernetes.io/projected/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-kube-api-access-rww68\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.613483 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.613855 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-logs\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.617840 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.619385 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-config-data\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.619423 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.621466 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-public-tls-certs\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.634024 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rww68\" (UniqueName: \"kubernetes.io/projected/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-kube-api-access-rww68\") pod \"nova-api-0\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.708570 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 13:27:07 crc kubenswrapper[4721]: I0202 13:27:07.961609 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c678f02b-cbee-4578-9e28-067b63af2682","Type":"ContainerStarted","Data":"0528daf98695dfbeae8802550b4ef40b0a4707fa1f775f8e1fb16ee705df3595"} Feb 02 13:27:08 crc kubenswrapper[4721]: I0202 13:27:08.046260 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-crdjt"] Feb 02 13:27:08 crc kubenswrapper[4721]: W0202 13:27:08.262269 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod136a6410_a20a_4e6a_bbc4_aaa3634e8af7.slice/crio-c6c02957f5278ef86c39592f8b9cb95239b0996358dc335274d947ae3f698b69 WatchSource:0}: Error finding container c6c02957f5278ef86c39592f8b9cb95239b0996358dc335274d947ae3f698b69: Status 404 returned error can't find the container with id c6c02957f5278ef86c39592f8b9cb95239b0996358dc335274d947ae3f698b69 Feb 02 13:27:08 crc kubenswrapper[4721]: I0202 13:27:08.265173 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 13:27:08 crc kubenswrapper[4721]: I0202 13:27:08.423880 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43a836c9-b9f7-4991-9cb4-db6dce6f8e08" path="/var/lib/kubelet/pods/43a836c9-b9f7-4991-9cb4-db6dce6f8e08/volumes" Feb 02 13:27:09 crc kubenswrapper[4721]: I0202 13:27:09.012634 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-crdjt" event={"ID":"27df0911-fe79-4339-a6fe-cf538f97a247","Type":"ContainerStarted","Data":"38697017d92e58f9ce89dc861401173c6d0237bce2d2e6a9a0bc00c9d2093bfd"} Feb 02 13:27:09 crc kubenswrapper[4721]: I0202 13:27:09.012698 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-crdjt" event={"ID":"27df0911-fe79-4339-a6fe-cf538f97a247","Type":"ContainerStarted","Data":"4a54c161cdb3b885270001f046db0e6690c82344ac0228a96c3ecc48853db922"} Feb 02 13:27:09 crc kubenswrapper[4721]: I0202 13:27:09.015059 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c678f02b-cbee-4578-9e28-067b63af2682","Type":"ContainerStarted","Data":"649ef2ed3aa64d3d89d7aea442bebcd1359026e865b111008bd46d8dcb55496f"} Feb 02 13:27:09 crc kubenswrapper[4721]: I0202 13:27:09.017539 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"136a6410-a20a-4e6a-bbc4-aaa3634e8af7","Type":"ContainerStarted","Data":"bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b"} Feb 02 13:27:09 crc kubenswrapper[4721]: I0202 13:27:09.017576 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"136a6410-a20a-4e6a-bbc4-aaa3634e8af7","Type":"ContainerStarted","Data":"f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca"} Feb 02 13:27:09 crc kubenswrapper[4721]: I0202 13:27:09.017588 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"136a6410-a20a-4e6a-bbc4-aaa3634e8af7","Type":"ContainerStarted","Data":"c6c02957f5278ef86c39592f8b9cb95239b0996358dc335274d947ae3f698b69"} Feb 02 13:27:09 crc kubenswrapper[4721]: I0202 13:27:09.045563 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-crdjt" podStartSLOduration=3.045541065 podStartE2EDuration="3.045541065s" 
Feb 02 13:27:09 crc kubenswrapper[4721]: I0202 13:27:09.081816 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.081795298 podStartE2EDuration="2.081795298s" podCreationTimestamp="2026-02-02 13:27:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:27:09.063749228 +0000 UTC m=+1569.366263637" watchObservedRunningTime="2026-02-02 13:27:09.081795298 +0000 UTC m=+1569.384309707"
Feb 02 13:27:10 crc kubenswrapper[4721]: I0202 13:27:10.038579 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c678f02b-cbee-4578-9e28-067b63af2682","Type":"ContainerStarted","Data":"937d004a2c0e368a16d6d0ca670b5482820ecb8e33a8a090dda3618c8b3e7f90"}
Feb 02 13:27:10 crc kubenswrapper[4721]: I0202 13:27:10.186219 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-fz8n5"
Feb 02 13:27:10 crc kubenswrapper[4721]: I0202 13:27:10.268239 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-pclnt"]
Feb 02 13:27:10 crc kubenswrapper[4721]: I0202 13:27:10.268484 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-pclnt" podUID="b87e6edb-4947-41a9-b95c-5120f9b4dbdc" containerName="dnsmasq-dns" containerID="cri-o://513383a4b8dba8040ab714a52c634492b679dbd50ddd5850c311449d4d0f662d" gracePeriod=10
Feb 02 13:27:10 crc kubenswrapper[4721]: E0202 13:27:10.376341 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb87e6edb_4947_41a9_b95c_5120f9b4dbdc.slice/crio-513383a4b8dba8040ab714a52c634492b679dbd50ddd5850c311449d4d0f662d.scope\": RecentStats: unable to find data in memory cache]"
Feb 02 13:27:10 crc kubenswrapper[4721]: I0202 13:27:10.964198 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-pclnt"
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.035054 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-config\") pod \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") "
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.035111 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-dns-svc\") pod \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") "
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.035188 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-ovsdbserver-nb\") pod \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") "
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.035226 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clh4b\" (UniqueName: \"kubernetes.io/projected/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-kube-api-access-clh4b\") pod \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") "
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.035272 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-dns-swift-storage-0\") pod \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") "
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.035346 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-ovsdbserver-sb\") pod \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\" (UID: \"b87e6edb-4947-41a9-b95c-5120f9b4dbdc\") "
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.040570 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-kube-api-access-clh4b" (OuterVolumeSpecName: "kube-api-access-clh4b") pod "b87e6edb-4947-41a9-b95c-5120f9b4dbdc" (UID: "b87e6edb-4947-41a9-b95c-5120f9b4dbdc"). InnerVolumeSpecName "kube-api-access-clh4b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.079697 4721 generic.go:334] "Generic (PLEG): container finished" podID="b87e6edb-4947-41a9-b95c-5120f9b4dbdc" containerID="513383a4b8dba8040ab714a52c634492b679dbd50ddd5850c311449d4d0f662d" exitCode=0
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.079785 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-pclnt"
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.079805 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-pclnt" event={"ID":"b87e6edb-4947-41a9-b95c-5120f9b4dbdc","Type":"ContainerDied","Data":"513383a4b8dba8040ab714a52c634492b679dbd50ddd5850c311449d4d0f662d"}
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.081127 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-pclnt" event={"ID":"b87e6edb-4947-41a9-b95c-5120f9b4dbdc","Type":"ContainerDied","Data":"5099999a072997f647e2393c3c5cb7d2e07ed8a533074bc15ca483b0989cce5e"}
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.081152 4721 scope.go:117] "RemoveContainer" containerID="513383a4b8dba8040ab714a52c634492b679dbd50ddd5850c311449d4d0f662d"
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.117178 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b87e6edb-4947-41a9-b95c-5120f9b4dbdc" (UID: "b87e6edb-4947-41a9-b95c-5120f9b4dbdc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.135486 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b87e6edb-4947-41a9-b95c-5120f9b4dbdc" (UID: "b87e6edb-4947-41a9-b95c-5120f9b4dbdc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.140276 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clh4b\" (UniqueName: \"kubernetes.io/projected/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-kube-api-access-clh4b\") on node \"crc\" DevicePath \"\""
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.141475 4721 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.141494 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.199981 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b87e6edb-4947-41a9-b95c-5120f9b4dbdc" (UID: "b87e6edb-4947-41a9-b95c-5120f9b4dbdc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.215590 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-config" (OuterVolumeSpecName: "config") pod "b87e6edb-4947-41a9-b95c-5120f9b4dbdc" (UID: "b87e6edb-4947-41a9-b95c-5120f9b4dbdc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.217053 4721 scope.go:117] "RemoveContainer" containerID="4daf317fd25f62ffeb9c6c1f2bdb1a7d756e3cbacfa5fcec5e77dfc48d7f394c" Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.244149 4721 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-config\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.244188 4721 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.249713 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b87e6edb-4947-41a9-b95c-5120f9b4dbdc" (UID: "b87e6edb-4947-41a9-b95c-5120f9b4dbdc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.261875 4721 scope.go:117] "RemoveContainer" containerID="513383a4b8dba8040ab714a52c634492b679dbd50ddd5850c311449d4d0f662d" Feb 02 13:27:11 crc kubenswrapper[4721]: E0202 13:27:11.262309 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"513383a4b8dba8040ab714a52c634492b679dbd50ddd5850c311449d4d0f662d\": container with ID starting with 513383a4b8dba8040ab714a52c634492b679dbd50ddd5850c311449d4d0f662d not found: ID does not exist" containerID="513383a4b8dba8040ab714a52c634492b679dbd50ddd5850c311449d4d0f662d" Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.262352 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"513383a4b8dba8040ab714a52c634492b679dbd50ddd5850c311449d4d0f662d"} err="failed to get container status \"513383a4b8dba8040ab714a52c634492b679dbd50ddd5850c311449d4d0f662d\": rpc error: code = NotFound desc = could not find container \"513383a4b8dba8040ab714a52c634492b679dbd50ddd5850c311449d4d0f662d\": container with ID starting with 513383a4b8dba8040ab714a52c634492b679dbd50ddd5850c311449d4d0f662d not found: ID does not exist" Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.262382 4721 scope.go:117] "RemoveContainer" containerID="4daf317fd25f62ffeb9c6c1f2bdb1a7d756e3cbacfa5fcec5e77dfc48d7f394c" Feb 02 13:27:11 crc kubenswrapper[4721]: E0202 13:27:11.262888 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4daf317fd25f62ffeb9c6c1f2bdb1a7d756e3cbacfa5fcec5e77dfc48d7f394c\": container with ID starting with 4daf317fd25f62ffeb9c6c1f2bdb1a7d756e3cbacfa5fcec5e77dfc48d7f394c not found: ID does not exist" containerID="4daf317fd25f62ffeb9c6c1f2bdb1a7d756e3cbacfa5fcec5e77dfc48d7f394c" Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.262995 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4daf317fd25f62ffeb9c6c1f2bdb1a7d756e3cbacfa5fcec5e77dfc48d7f394c"} err="failed to get container status \"4daf317fd25f62ffeb9c6c1f2bdb1a7d756e3cbacfa5fcec5e77dfc48d7f394c\": rpc error: code = NotFound desc = could not find container \"4daf317fd25f62ffeb9c6c1f2bdb1a7d756e3cbacfa5fcec5e77dfc48d7f394c\": container with ID starting with 
4daf317fd25f62ffeb9c6c1f2bdb1a7d756e3cbacfa5fcec5e77dfc48d7f394c not found: ID does not exist" Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.345993 4721 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b87e6edb-4947-41a9-b95c-5120f9b4dbdc-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.471468 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-pclnt"] Feb 02 13:27:11 crc kubenswrapper[4721]: I0202 13:27:11.484980 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-pclnt"] Feb 02 13:27:12 crc kubenswrapper[4721]: I0202 13:27:12.100855 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c678f02b-cbee-4578-9e28-067b63af2682","Type":"ContainerStarted","Data":"cd5a14c0f956517b7ab15b3a6ee8d26d1a69dbd8f7c7525fbe48b4f147cbbb30"} Feb 02 13:27:12 crc kubenswrapper[4721]: I0202 13:27:12.102879 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 13:27:12 crc kubenswrapper[4721]: I0202 13:27:12.127739 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.301826104 podStartE2EDuration="7.127721127s" podCreationTimestamp="2026-02-02 13:27:05 +0000 UTC" firstStartedPulling="2026-02-02 13:27:06.664839148 +0000 UTC m=+1566.967353537" lastFinishedPulling="2026-02-02 13:27:11.490734171 +0000 UTC m=+1571.793248560" observedRunningTime="2026-02-02 13:27:12.12342742 +0000 UTC m=+1572.425941819" watchObservedRunningTime="2026-02-02 13:27:12.127721127 +0000 UTC m=+1572.430235516" Feb 02 13:27:12 crc kubenswrapper[4721]: I0202 13:27:12.423524 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b87e6edb-4947-41a9-b95c-5120f9b4dbdc" path="/var/lib/kubelet/pods/b87e6edb-4947-41a9-b95c-5120f9b4dbdc/volumes" Feb 02 13:27:14 crc kubenswrapper[4721]: I0202 13:27:14.136332 4721 generic.go:334] "Generic (PLEG): container finished" podID="27df0911-fe79-4339-a6fe-cf538f97a247" containerID="38697017d92e58f9ce89dc861401173c6d0237bce2d2e6a9a0bc00c9d2093bfd" exitCode=0 Feb 02 13:27:14 crc kubenswrapper[4721]: I0202 13:27:14.136387 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-crdjt" event={"ID":"27df0911-fe79-4339-a6fe-cf538f97a247","Type":"ContainerDied","Data":"38697017d92e58f9ce89dc861401173c6d0237bce2d2e6a9a0bc00c9d2093bfd"} Feb 02 13:27:15 crc kubenswrapper[4721]: I0202 13:27:15.624681 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:15 crc kubenswrapper[4721]: I0202 13:27:15.764784 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-scripts\") pod \"27df0911-fe79-4339-a6fe-cf538f97a247\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " Feb 02 13:27:15 crc kubenswrapper[4721]: I0202 13:27:15.765115 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-config-data\") pod \"27df0911-fe79-4339-a6fe-cf538f97a247\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " Feb 02 13:27:15 crc kubenswrapper[4721]: I0202 13:27:15.765239 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6svz\" (UniqueName: \"kubernetes.io/projected/27df0911-fe79-4339-a6fe-cf538f97a247-kube-api-access-m6svz\") pod \"27df0911-fe79-4339-a6fe-cf538f97a247\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " Feb 02 13:27:15 crc kubenswrapper[4721]: I0202 13:27:15.765565 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-combined-ca-bundle\") pod \"27df0911-fe79-4339-a6fe-cf538f97a247\" (UID: \"27df0911-fe79-4339-a6fe-cf538f97a247\") " Feb 02 13:27:15 crc kubenswrapper[4721]: I0202 13:27:15.772592 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-scripts" (OuterVolumeSpecName: "scripts") pod "27df0911-fe79-4339-a6fe-cf538f97a247" (UID: "27df0911-fe79-4339-a6fe-cf538f97a247"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:15 crc kubenswrapper[4721]: I0202 13:27:15.772960 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27df0911-fe79-4339-a6fe-cf538f97a247-kube-api-access-m6svz" (OuterVolumeSpecName: "kube-api-access-m6svz") pod "27df0911-fe79-4339-a6fe-cf538f97a247" (UID: "27df0911-fe79-4339-a6fe-cf538f97a247"). InnerVolumeSpecName "kube-api-access-m6svz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:27:15 crc kubenswrapper[4721]: I0202 13:27:15.798531 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-config-data" (OuterVolumeSpecName: "config-data") pod "27df0911-fe79-4339-a6fe-cf538f97a247" (UID: "27df0911-fe79-4339-a6fe-cf538f97a247"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:15 crc kubenswrapper[4721]: I0202 13:27:15.799968 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27df0911-fe79-4339-a6fe-cf538f97a247" (UID: "27df0911-fe79-4339-a6fe-cf538f97a247"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:15 crc kubenswrapper[4721]: I0202 13:27:15.868489 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:15 crc kubenswrapper[4721]: I0202 13:27:15.868772 4721 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:15 crc kubenswrapper[4721]: I0202 13:27:15.868780 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27df0911-fe79-4339-a6fe-cf538f97a247-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:15 crc kubenswrapper[4721]: I0202 13:27:15.868791 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6svz\" (UniqueName: \"kubernetes.io/projected/27df0911-fe79-4339-a6fe-cf538f97a247-kube-api-access-m6svz\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:16 crc kubenswrapper[4721]: I0202 13:27:16.162060 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-crdjt" event={"ID":"27df0911-fe79-4339-a6fe-cf538f97a247","Type":"ContainerDied","Data":"4a54c161cdb3b885270001f046db0e6690c82344ac0228a96c3ecc48853db922"} Feb 02 13:27:16 crc kubenswrapper[4721]: I0202 13:27:16.162117 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a54c161cdb3b885270001f046db0e6690c82344ac0228a96c3ecc48853db922" Feb 02 13:27:16 crc kubenswrapper[4721]: I0202 13:27:16.162200 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-crdjt" Feb 02 13:27:16 crc kubenswrapper[4721]: I0202 13:27:16.347417 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 13:27:16 crc kubenswrapper[4721]: I0202 13:27:16.347696 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1f591d46-b7ce-4767-987a-bcdaa2f6d3b1" containerName="nova-scheduler-scheduler" containerID="cri-o://5acb431b8e0eb7677b98d89ff66f186391a5d39e1b675d018fe41d3b90c905cd" gracePeriod=30 Feb 02 13:27:16 crc kubenswrapper[4721]: I0202 13:27:16.361096 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 13:27:16 crc kubenswrapper[4721]: I0202 13:27:16.361933 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="136a6410-a20a-4e6a-bbc4-aaa3634e8af7" containerName="nova-api-api" containerID="cri-o://bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b" gracePeriod=30 Feb 02 13:27:16 crc kubenswrapper[4721]: I0202 13:27:16.362108 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="136a6410-a20a-4e6a-bbc4-aaa3634e8af7" containerName="nova-api-log" containerID="cri-o://f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca" gracePeriod=30 Feb 02 13:27:16 crc kubenswrapper[4721]: I0202 13:27:16.430235 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:27:16 crc kubenswrapper[4721]: I0202 13:27:16.431194 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fbfc7ae7-2e8c-4696-a72e-7308794bf726" 
containerName="nova-metadata-log" containerID="cri-o://4fd6cf6542770e67b596179be2f1fb73bfe89b859a5b9f3f4406655d59a15229" gracePeriod=30 Feb 02 13:27:16 crc kubenswrapper[4721]: I0202 13:27:16.431282 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fbfc7ae7-2e8c-4696-a72e-7308794bf726" containerName="nova-metadata-metadata" containerID="cri-o://74967a457dae02a2ce8a603739bc30e8cdd7da978739f51561117554fb3ba203" gracePeriod=30 Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.139623 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.175470 4721 generic.go:334] "Generic (PLEG): container finished" podID="136a6410-a20a-4e6a-bbc4-aaa3634e8af7" containerID="bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b" exitCode=0 Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.175779 4721 generic.go:334] "Generic (PLEG): container finished" podID="136a6410-a20a-4e6a-bbc4-aaa3634e8af7" containerID="f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca" exitCode=143 Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.175543 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.175549 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"136a6410-a20a-4e6a-bbc4-aaa3634e8af7","Type":"ContainerDied","Data":"bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b"} Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.177130 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"136a6410-a20a-4e6a-bbc4-aaa3634e8af7","Type":"ContainerDied","Data":"f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca"} Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.177149 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"136a6410-a20a-4e6a-bbc4-aaa3634e8af7","Type":"ContainerDied","Data":"c6c02957f5278ef86c39592f8b9cb95239b0996358dc335274d947ae3f698b69"} Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.177169 4721 scope.go:117] "RemoveContainer" containerID="bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.179694 4721 generic.go:334] "Generic (PLEG): container finished" podID="fbfc7ae7-2e8c-4696-a72e-7308794bf726" containerID="4fd6cf6542770e67b596179be2f1fb73bfe89b859a5b9f3f4406655d59a15229" exitCode=143 Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.179781 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbfc7ae7-2e8c-4696-a72e-7308794bf726","Type":"ContainerDied","Data":"4fd6cf6542770e67b596179be2f1fb73bfe89b859a5b9f3f4406655d59a15229"} Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.202718 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-public-tls-certs\") pod \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.202760 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-internal-tls-certs\") pod \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.202828 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-combined-ca-bundle\") pod \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.202861 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rww68\" (UniqueName: \"kubernetes.io/projected/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-kube-api-access-rww68\") pod \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.202938 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-logs\") pod \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.203034 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-config-data\") pod \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\" (UID: \"136a6410-a20a-4e6a-bbc4-aaa3634e8af7\") " Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.204570 4721 scope.go:117] "RemoveContainer" containerID="f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.206555 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-logs" (OuterVolumeSpecName: "logs") pod "136a6410-a20a-4e6a-bbc4-aaa3634e8af7" (UID: "136a6410-a20a-4e6a-bbc4-aaa3634e8af7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.209706 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-kube-api-access-rww68" (OuterVolumeSpecName: "kube-api-access-rww68") pod "136a6410-a20a-4e6a-bbc4-aaa3634e8af7" (UID: "136a6410-a20a-4e6a-bbc4-aaa3634e8af7"). InnerVolumeSpecName "kube-api-access-rww68". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.242272 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-config-data" (OuterVolumeSpecName: "config-data") pod "136a6410-a20a-4e6a-bbc4-aaa3634e8af7" (UID: "136a6410-a20a-4e6a-bbc4-aaa3634e8af7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.242412 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "136a6410-a20a-4e6a-bbc4-aaa3634e8af7" (UID: "136a6410-a20a-4e6a-bbc4-aaa3634e8af7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.291248 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "136a6410-a20a-4e6a-bbc4-aaa3634e8af7" (UID: "136a6410-a20a-4e6a-bbc4-aaa3634e8af7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.295330 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "136a6410-a20a-4e6a-bbc4-aaa3634e8af7" (UID: "136a6410-a20a-4e6a-bbc4-aaa3634e8af7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.302102 4721 scope.go:117] "RemoveContainer" containerID="bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b" Feb 02 13:27:17 crc kubenswrapper[4721]: E0202 13:27:17.302590 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b\": container with ID starting with bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b not found: ID does not exist" containerID="bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.302636 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b"} err="failed to get container status \"bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b\": rpc error: code = NotFound desc = could not find container \"bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b\": container with ID starting with bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b not found: ID does not exist" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.302665 4721 scope.go:117] "RemoveContainer" containerID="f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca" Feb 02 13:27:17 crc kubenswrapper[4721]: E0202 13:27:17.303014 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca\": container with ID starting with f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca not found: ID does not exist" containerID="f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.303058 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca"} err="failed to get container status \"f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca\": rpc error: code = NotFound desc = could not find container \"f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca\": container with ID starting with f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca not found: ID does not exist" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.303099 4721 scope.go:117] "RemoveContainer" 
containerID="bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.303403 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b"} err="failed to get container status \"bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b\": rpc error: code = NotFound desc = could not find container \"bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b\": container with ID starting with bf21873a2e828ded4b708082cfc1dfc4d06001ee10f1de31d95e3013760a815b not found: ID does not exist" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.303431 4721 scope.go:117] "RemoveContainer" containerID="f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.303758 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca"} err="failed to get container status \"f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca\": rpc error: code = NotFound desc = could not find container \"f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca\": container with ID starting with f756020708f6c293792ee758adf5bf7623dfce1b04f53de592987961836aa5ca not found: ID does not exist" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.305330 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.305353 4721 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.305363 4721 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.305374 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.305384 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rww68\" (UniqueName: \"kubernetes.io/projected/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-kube-api-access-rww68\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.305393 4721 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/136a6410-a20a-4e6a-bbc4-aaa3634e8af7-logs\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.516470 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.531045 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.541408 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 13:27:17 crc kubenswrapper[4721]: E0202 13:27:17.541840 4721 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87e6edb-4947-41a9-b95c-5120f9b4dbdc" containerName="init" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.541856 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87e6edb-4947-41a9-b95c-5120f9b4dbdc" containerName="init" Feb 02 13:27:17 crc kubenswrapper[4721]: E0202 13:27:17.541872 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="136a6410-a20a-4e6a-bbc4-aaa3634e8af7" containerName="nova-api-log" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.541878 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="136a6410-a20a-4e6a-bbc4-aaa3634e8af7" containerName="nova-api-log" Feb 02 13:27:17 crc kubenswrapper[4721]: E0202 13:27:17.541888 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27df0911-fe79-4339-a6fe-cf538f97a247" containerName="nova-manage" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.541893 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="27df0911-fe79-4339-a6fe-cf538f97a247" containerName="nova-manage" Feb 02 13:27:17 crc kubenswrapper[4721]: E0202 13:27:17.541917 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="136a6410-a20a-4e6a-bbc4-aaa3634e8af7" containerName="nova-api-api" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.541922 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="136a6410-a20a-4e6a-bbc4-aaa3634e8af7" containerName="nova-api-api" Feb 02 13:27:17 crc kubenswrapper[4721]: E0202 13:27:17.541947 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b87e6edb-4947-41a9-b95c-5120f9b4dbdc" containerName="dnsmasq-dns" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.541953 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b87e6edb-4947-41a9-b95c-5120f9b4dbdc" containerName="dnsmasq-dns" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.542282 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b87e6edb-4947-41a9-b95c-5120f9b4dbdc" containerName="dnsmasq-dns" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.542324 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="136a6410-a20a-4e6a-bbc4-aaa3634e8af7" containerName="nova-api-log" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.542335 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="27df0911-fe79-4339-a6fe-cf538f97a247" containerName="nova-manage" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.542344 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="136a6410-a20a-4e6a-bbc4-aaa3634e8af7" containerName="nova-api-api" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.544968 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.548913 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.549179 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.549504 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.561219 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.614279 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eccca7c-e269-4ecc-9fce-024196f66aaa-config-data\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.614328 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eccca7c-e269-4ecc-9fce-024196f66aaa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.614355 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eccca7c-e269-4ecc-9fce-024196f66aaa-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.614431 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ljpx\" (UniqueName: \"kubernetes.io/projected/8eccca7c-e269-4ecc-9fce-024196f66aaa-kube-api-access-6ljpx\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.614496 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eccca7c-e269-4ecc-9fce-024196f66aaa-logs\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.614563 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eccca7c-e269-4ecc-9fce-024196f66aaa-public-tls-certs\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.717051 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eccca7c-e269-4ecc-9fce-024196f66aaa-logs\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.717165 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eccca7c-e269-4ecc-9fce-024196f66aaa-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.717266 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eccca7c-e269-4ecc-9fce-024196f66aaa-config-data\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.717289 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eccca7c-e269-4ecc-9fce-024196f66aaa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.717313 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eccca7c-e269-4ecc-9fce-024196f66aaa-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.717369 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ljpx\" (UniqueName: \"kubernetes.io/projected/8eccca7c-e269-4ecc-9fce-024196f66aaa-kube-api-access-6ljpx\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.718236 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eccca7c-e269-4ecc-9fce-024196f66aaa-logs\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.721188 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eccca7c-e269-4ecc-9fce-024196f66aaa-config-data\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.721335 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eccca7c-e269-4ecc-9fce-024196f66aaa-public-tls-certs\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.726714 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eccca7c-e269-4ecc-9fce-024196f66aaa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.729560 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8eccca7c-e269-4ecc-9fce-024196f66aaa-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.738916 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ljpx\" (UniqueName: \"kubernetes.io/projected/8eccca7c-e269-4ecc-9fce-024196f66aaa-kube-api-access-6ljpx\") pod \"nova-api-0\" (UID: \"8eccca7c-e269-4ecc-9fce-024196f66aaa\") " 
pod="openstack/nova-api-0" Feb 02 13:27:17 crc kubenswrapper[4721]: I0202 13:27:17.928888 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 13:27:18 crc kubenswrapper[4721]: W0202 13:27:18.382388 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8eccca7c_e269_4ecc_9fce_024196f66aaa.slice/crio-1ee255869e6f59cd6cda7d9aa89fb630c94ff69e7e80d9de3b97442d6b610f4e WatchSource:0}: Error finding container 1ee255869e6f59cd6cda7d9aa89fb630c94ff69e7e80d9de3b97442d6b610f4e: Status 404 returned error can't find the container with id 1ee255869e6f59cd6cda7d9aa89fb630c94ff69e7e80d9de3b97442d6b610f4e Feb 02 13:27:18 crc kubenswrapper[4721]: I0202 13:27:18.385497 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 13:27:18 crc kubenswrapper[4721]: I0202 13:27:18.427893 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="136a6410-a20a-4e6a-bbc4-aaa3634e8af7" path="/var/lib/kubelet/pods/136a6410-a20a-4e6a-bbc4-aaa3634e8af7/volumes" Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.233817 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8eccca7c-e269-4ecc-9fce-024196f66aaa","Type":"ContainerStarted","Data":"3648d6846951dd5e0f3934169f96933d009589baa5c362fe89bdaa232213073c"} Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.234468 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8eccca7c-e269-4ecc-9fce-024196f66aaa","Type":"ContainerStarted","Data":"b521f058d8fe8ab6771dfc8a3e266876588ba640ad8dda8bf97332321154d3c5"} Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.234481 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8eccca7c-e269-4ecc-9fce-024196f66aaa","Type":"ContainerStarted","Data":"1ee255869e6f59cd6cda7d9aa89fb630c94ff69e7e80d9de3b97442d6b610f4e"} Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.236148 4721 generic.go:334] "Generic (PLEG): container finished" podID="1f591d46-b7ce-4767-987a-bcdaa2f6d3b1" containerID="5acb431b8e0eb7677b98d89ff66f186391a5d39e1b675d018fe41d3b90c905cd" exitCode=0 Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.236185 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1","Type":"ContainerDied","Data":"5acb431b8e0eb7677b98d89ff66f186391a5d39e1b675d018fe41d3b90c905cd"} Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.255242 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.255225421 podStartE2EDuration="2.255225421s" podCreationTimestamp="2026-02-02 13:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:27:19.253881824 +0000 UTC m=+1579.556396213" watchObservedRunningTime="2026-02-02 13:27:19.255225421 +0000 UTC m=+1579.557739810" Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.383038 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.557230 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb5wd\" (UniqueName: \"kubernetes.io/projected/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-kube-api-access-zb5wd\") pod \"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1\" (UID: \"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1\") " Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.557492 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-combined-ca-bundle\") pod \"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1\" (UID: \"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1\") " Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.557526 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-config-data\") pod \"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1\" (UID: \"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1\") " Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.563895 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-kube-api-access-zb5wd" (OuterVolumeSpecName: "kube-api-access-zb5wd") pod "1f591d46-b7ce-4767-987a-bcdaa2f6d3b1" (UID: "1f591d46-b7ce-4767-987a-bcdaa2f6d3b1"). InnerVolumeSpecName "kube-api-access-zb5wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.572124 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fbfc7ae7-2e8c-4696-a72e-7308794bf726" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.254:8775/\": read tcp 10.217.0.2:47056->10.217.0.254:8775: read: connection reset by peer" Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.572167 4721 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fbfc7ae7-2e8c-4696-a72e-7308794bf726" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.254:8775/\": read tcp 10.217.0.2:47060->10.217.0.254:8775: read: connection reset by peer" Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.595015 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f591d46-b7ce-4767-987a-bcdaa2f6d3b1" (UID: "1f591d46-b7ce-4767-987a-bcdaa2f6d3b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.597157 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-config-data" (OuterVolumeSpecName: "config-data") pod "1f591d46-b7ce-4767-987a-bcdaa2f6d3b1" (UID: "1f591d46-b7ce-4767-987a-bcdaa2f6d3b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.660849 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.660985 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:19 crc kubenswrapper[4721]: I0202 13:27:19.661115 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb5wd\" (UniqueName: \"kubernetes.io/projected/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1-kube-api-access-zb5wd\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.065677 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.170778 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-combined-ca-bundle\") pod \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.170997 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbfc7ae7-2e8c-4696-a72e-7308794bf726-logs\") pod \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.171036 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-config-data\") pod \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.171574 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbfc7ae7-2e8c-4696-a72e-7308794bf726-logs" (OuterVolumeSpecName: "logs") pod "fbfc7ae7-2e8c-4696-a72e-7308794bf726" (UID: "fbfc7ae7-2e8c-4696-a72e-7308794bf726"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.173617 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t22jm\" (UniqueName: \"kubernetes.io/projected/fbfc7ae7-2e8c-4696-a72e-7308794bf726-kube-api-access-t22jm\") pod \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.174100 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-nova-metadata-tls-certs\") pod \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\" (UID: \"fbfc7ae7-2e8c-4696-a72e-7308794bf726\") " Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.174957 4721 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbfc7ae7-2e8c-4696-a72e-7308794bf726-logs\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.177684 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbfc7ae7-2e8c-4696-a72e-7308794bf726-kube-api-access-t22jm" (OuterVolumeSpecName: "kube-api-access-t22jm") pod "fbfc7ae7-2e8c-4696-a72e-7308794bf726" (UID: "fbfc7ae7-2e8c-4696-a72e-7308794bf726"). InnerVolumeSpecName "kube-api-access-t22jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.205237 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-config-data" (OuterVolumeSpecName: "config-data") pod "fbfc7ae7-2e8c-4696-a72e-7308794bf726" (UID: "fbfc7ae7-2e8c-4696-a72e-7308794bf726"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.224954 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbfc7ae7-2e8c-4696-a72e-7308794bf726" (UID: "fbfc7ae7-2e8c-4696-a72e-7308794bf726"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.249481 4721 generic.go:334] "Generic (PLEG): container finished" podID="fbfc7ae7-2e8c-4696-a72e-7308794bf726" containerID="74967a457dae02a2ce8a603739bc30e8cdd7da978739f51561117554fb3ba203" exitCode=0 Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.249554 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbfc7ae7-2e8c-4696-a72e-7308794bf726","Type":"ContainerDied","Data":"74967a457dae02a2ce8a603739bc30e8cdd7da978739f51561117554fb3ba203"} Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.249584 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fbfc7ae7-2e8c-4696-a72e-7308794bf726","Type":"ContainerDied","Data":"47ce14db062af36ae7fdbf425398b0057cb5127414ac2231235ece255a18295a"} Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.249602 4721 scope.go:117] "RemoveContainer" containerID="74967a457dae02a2ce8a603739bc30e8cdd7da978739f51561117554fb3ba203" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.249655 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.252803 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "fbfc7ae7-2e8c-4696-a72e-7308794bf726" (UID: "fbfc7ae7-2e8c-4696-a72e-7308794bf726"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.254168 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.254220 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1f591d46-b7ce-4767-987a-bcdaa2f6d3b1","Type":"ContainerDied","Data":"069121de18c3f33f1304829d8169978a460562815bd16049d235be352ef578bb"} Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.283129 4721 scope.go:117] "RemoveContainer" containerID="4fd6cf6542770e67b596179be2f1fb73bfe89b859a5b9f3f4406655d59a15229" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.283163 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t22jm\" (UniqueName: \"kubernetes.io/projected/fbfc7ae7-2e8c-4696-a72e-7308794bf726-kube-api-access-t22jm\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.283210 4721 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.283231 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.283253 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbfc7ae7-2e8c-4696-a72e-7308794bf726-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.344271 4721 scope.go:117] "RemoveContainer" containerID="74967a457dae02a2ce8a603739bc30e8cdd7da978739f51561117554fb3ba203" Feb 02 13:27:20 crc kubenswrapper[4721]: E0202 13:27:20.344835 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74967a457dae02a2ce8a603739bc30e8cdd7da978739f51561117554fb3ba203\": container with ID starting with 74967a457dae02a2ce8a603739bc30e8cdd7da978739f51561117554fb3ba203 not found: ID does not exist" containerID="74967a457dae02a2ce8a603739bc30e8cdd7da978739f51561117554fb3ba203" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.345015 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74967a457dae02a2ce8a603739bc30e8cdd7da978739f51561117554fb3ba203"} err="failed to get container status \"74967a457dae02a2ce8a603739bc30e8cdd7da978739f51561117554fb3ba203\": rpc error: code = NotFound desc = could not find container \"74967a457dae02a2ce8a603739bc30e8cdd7da978739f51561117554fb3ba203\": container with ID starting with 74967a457dae02a2ce8a603739bc30e8cdd7da978739f51561117554fb3ba203 not found: ID does not exist" Feb 02 13:27:20 crc kubenswrapper[4721]: 
I0202 13:27:20.345248 4721 scope.go:117] "RemoveContainer" containerID="4fd6cf6542770e67b596179be2f1fb73bfe89b859a5b9f3f4406655d59a15229" Feb 02 13:27:20 crc kubenswrapper[4721]: E0202 13:27:20.346295 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fd6cf6542770e67b596179be2f1fb73bfe89b859a5b9f3f4406655d59a15229\": container with ID starting with 4fd6cf6542770e67b596179be2f1fb73bfe89b859a5b9f3f4406655d59a15229 not found: ID does not exist" containerID="4fd6cf6542770e67b596179be2f1fb73bfe89b859a5b9f3f4406655d59a15229" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.346332 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd6cf6542770e67b596179be2f1fb73bfe89b859a5b9f3f4406655d59a15229"} err="failed to get container status \"4fd6cf6542770e67b596179be2f1fb73bfe89b859a5b9f3f4406655d59a15229\": rpc error: code = NotFound desc = could not find container \"4fd6cf6542770e67b596179be2f1fb73bfe89b859a5b9f3f4406655d59a15229\": container with ID starting with 4fd6cf6542770e67b596179be2f1fb73bfe89b859a5b9f3f4406655d59a15229 not found: ID does not exist" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.346357 4721 scope.go:117] "RemoveContainer" containerID="5acb431b8e0eb7677b98d89ff66f186391a5d39e1b675d018fe41d3b90c905cd" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.357603 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.384311 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.396210 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 13:27:20 crc kubenswrapper[4721]: E0202 13:27:20.396862 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbfc7ae7-2e8c-4696-a72e-7308794bf726" containerName="nova-metadata-metadata" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.396891 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbfc7ae7-2e8c-4696-a72e-7308794bf726" containerName="nova-metadata-metadata" Feb 02 13:27:20 crc kubenswrapper[4721]: E0202 13:27:20.396923 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbfc7ae7-2e8c-4696-a72e-7308794bf726" containerName="nova-metadata-log" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.396931 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbfc7ae7-2e8c-4696-a72e-7308794bf726" containerName="nova-metadata-log" Feb 02 13:27:20 crc kubenswrapper[4721]: E0202 13:27:20.396965 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f591d46-b7ce-4767-987a-bcdaa2f6d3b1" containerName="nova-scheduler-scheduler" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.396973 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f591d46-b7ce-4767-987a-bcdaa2f6d3b1" containerName="nova-scheduler-scheduler" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.397281 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbfc7ae7-2e8c-4696-a72e-7308794bf726" containerName="nova-metadata-log" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.397302 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f591d46-b7ce-4767-987a-bcdaa2f6d3b1" containerName="nova-scheduler-scheduler" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.397315 4721 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fbfc7ae7-2e8c-4696-a72e-7308794bf726" containerName="nova-metadata-metadata" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.398320 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.401574 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.408388 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.426703 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f591d46-b7ce-4767-987a-bcdaa2f6d3b1" path="/var/lib/kubelet/pods/1f591d46-b7ce-4767-987a-bcdaa2f6d3b1/volumes" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.580590 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.590450 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kkz2\" (UniqueName: \"kubernetes.io/projected/d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728-kube-api-access-6kkz2\") pod \"nova-scheduler-0\" (UID: \"d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728\") " pod="openstack/nova-scheduler-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.590549 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728-config-data\") pod \"nova-scheduler-0\" (UID: \"d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728\") " pod="openstack/nova-scheduler-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.591108 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728\") " pod="openstack/nova-scheduler-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.606769 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.618276 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.620531 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.623630 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.623706 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.628792 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.693732 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kkz2\" (UniqueName: \"kubernetes.io/projected/d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728-kube-api-access-6kkz2\") pod \"nova-scheduler-0\" (UID: \"d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728\") " pod="openstack/nova-scheduler-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.693807 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728-config-data\") pod \"nova-scheduler-0\" (UID: \"d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728\") " pod="openstack/nova-scheduler-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.693948 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728\") " pod="openstack/nova-scheduler-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.699816 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728\") " pod="openstack/nova-scheduler-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.704827 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728-config-data\") pod \"nova-scheduler-0\" (UID: \"d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728\") " pod="openstack/nova-scheduler-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.709045 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kkz2\" (UniqueName: \"kubernetes.io/projected/d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728-kube-api-access-6kkz2\") pod \"nova-scheduler-0\" (UID: \"d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728\") " pod="openstack/nova-scheduler-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.720397 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.796307 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc4rm\" (UniqueName: \"kubernetes.io/projected/f6e14b26-cab3-4acd-aad2-8cda004e0282-kube-api-access-tc4rm\") pod \"nova-metadata-0\" (UID: \"f6e14b26-cab3-4acd-aad2-8cda004e0282\") " pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.796360 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6e14b26-cab3-4acd-aad2-8cda004e0282-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f6e14b26-cab3-4acd-aad2-8cda004e0282\") " pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.796585 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6e14b26-cab3-4acd-aad2-8cda004e0282-logs\") pod \"nova-metadata-0\" (UID: \"f6e14b26-cab3-4acd-aad2-8cda004e0282\") " pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.796994 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6e14b26-cab3-4acd-aad2-8cda004e0282-config-data\") pod \"nova-metadata-0\" (UID: \"f6e14b26-cab3-4acd-aad2-8cda004e0282\") " pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.797133 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6e14b26-cab3-4acd-aad2-8cda004e0282-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f6e14b26-cab3-4acd-aad2-8cda004e0282\") " pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.899859 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6e14b26-cab3-4acd-aad2-8cda004e0282-config-data\") pod \"nova-metadata-0\" (UID: \"f6e14b26-cab3-4acd-aad2-8cda004e0282\") " pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.900030 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6e14b26-cab3-4acd-aad2-8cda004e0282-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f6e14b26-cab3-4acd-aad2-8cda004e0282\") " pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.900152 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc4rm\" (UniqueName: \"kubernetes.io/projected/f6e14b26-cab3-4acd-aad2-8cda004e0282-kube-api-access-tc4rm\") pod \"nova-metadata-0\" (UID: \"f6e14b26-cab3-4acd-aad2-8cda004e0282\") " pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.900195 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6e14b26-cab3-4acd-aad2-8cda004e0282-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f6e14b26-cab3-4acd-aad2-8cda004e0282\") " pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.900338 4721 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6e14b26-cab3-4acd-aad2-8cda004e0282-logs\") pod \"nova-metadata-0\" (UID: \"f6e14b26-cab3-4acd-aad2-8cda004e0282\") " pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.900897 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6e14b26-cab3-4acd-aad2-8cda004e0282-logs\") pod \"nova-metadata-0\" (UID: \"f6e14b26-cab3-4acd-aad2-8cda004e0282\") " pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.904345 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6e14b26-cab3-4acd-aad2-8cda004e0282-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f6e14b26-cab3-4acd-aad2-8cda004e0282\") " pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.904973 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6e14b26-cab3-4acd-aad2-8cda004e0282-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f6e14b26-cab3-4acd-aad2-8cda004e0282\") " pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.914306 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6e14b26-cab3-4acd-aad2-8cda004e0282-config-data\") pod \"nova-metadata-0\" (UID: \"f6e14b26-cab3-4acd-aad2-8cda004e0282\") " pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.916937 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc4rm\" (UniqueName: \"kubernetes.io/projected/f6e14b26-cab3-4acd-aad2-8cda004e0282-kube-api-access-tc4rm\") pod \"nova-metadata-0\" (UID: \"f6e14b26-cab3-4acd-aad2-8cda004e0282\") " pod="openstack/nova-metadata-0" Feb 02 13:27:20 crc kubenswrapper[4721]: I0202 13:27:20.987188 4721 util.go:30] "No sandbox for pod can be found. 
Feb 02 13:27:21 crc kubenswrapper[4721]: I0202 13:27:21.192367 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 02 13:27:21 crc kubenswrapper[4721]: W0202 13:27:21.200909 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3e7848b_5b3e_4e6b_8c5e_82cd9f2f7728.slice/crio-d8b4dda5ad7a0b3d40cae6755ed625204fa40c0f257aee261f6c7ea873cd124d WatchSource:0}: Error finding container d8b4dda5ad7a0b3d40cae6755ed625204fa40c0f257aee261f6c7ea873cd124d: Status 404 returned error can't find the container with id d8b4dda5ad7a0b3d40cae6755ed625204fa40c0f257aee261f6c7ea873cd124d
Feb 02 13:27:21 crc kubenswrapper[4721]: I0202 13:27:21.272513 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728","Type":"ContainerStarted","Data":"d8b4dda5ad7a0b3d40cae6755ed625204fa40c0f257aee261f6c7ea873cd124d"}
Feb 02 13:27:21 crc kubenswrapper[4721]: I0202 13:27:21.439793 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 02 13:27:21 crc kubenswrapper[4721]: W0202 13:27:21.442401 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6e14b26_cab3_4acd_aad2_8cda004e0282.slice/crio-a271e0b31b8b5aa1808fbdab19d01a670541e51a63b6f31c199fe2ab6574f8a3 WatchSource:0}: Error finding container a271e0b31b8b5aa1808fbdab19d01a670541e51a63b6f31c199fe2ab6574f8a3: Status 404 returned error can't find the container with id a271e0b31b8b5aa1808fbdab19d01a670541e51a63b6f31c199fe2ab6574f8a3
Feb 02 13:27:22 crc kubenswrapper[4721]: I0202 13:27:22.286427 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f6e14b26-cab3-4acd-aad2-8cda004e0282","Type":"ContainerStarted","Data":"ce6a8662a0b8ecab16749cc46d28ed7b9c5b6b3bc69fc300341f3c0fd2e5c384"}
Feb 02 13:27:22 crc kubenswrapper[4721]: I0202 13:27:22.286811 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f6e14b26-cab3-4acd-aad2-8cda004e0282","Type":"ContainerStarted","Data":"7954bcef1e04f6e445d6281716b387227fbc3e2114d3e27b09b8602be4abf5d1"}
Feb 02 13:27:22 crc kubenswrapper[4721]: I0202 13:27:22.286868 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f6e14b26-cab3-4acd-aad2-8cda004e0282","Type":"ContainerStarted","Data":"a271e0b31b8b5aa1808fbdab19d01a670541e51a63b6f31c199fe2ab6574f8a3"}
Feb 02 13:27:22 crc kubenswrapper[4721]: I0202 13:27:22.288286 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728","Type":"ContainerStarted","Data":"859cdc3d39c0363ca2f661c1cf1914488aa73c81c8570620643ccebc9f320a8f"}
Feb 02 13:27:22 crc kubenswrapper[4721]: I0202 13:27:22.310543 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.310518963 podStartE2EDuration="2.310518963s" podCreationTimestamp="2026-02-02 13:27:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:27:22.308040076 +0000 UTC m=+1582.610554465" watchObservedRunningTime="2026-02-02 13:27:22.310518963 +0000 UTC m=+1582.613033352"
Feb 02 13:27:22 crc kubenswrapper[4721]: I0202 13:27:22.336675 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.336653332 podStartE2EDuration="2.336653332s" podCreationTimestamp="2026-02-02 13:27:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:27:22.328575843 +0000 UTC m=+1582.631090232" watchObservedRunningTime="2026-02-02 13:27:22.336653332 +0000 UTC m=+1582.639167721"
Feb 02 13:27:22 crc kubenswrapper[4721]: I0202 13:27:22.424545 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbfc7ae7-2e8c-4696-a72e-7308794bf726" path="/var/lib/kubelet/pods/fbfc7ae7-2e8c-4696-a72e-7308794bf726/volumes"
Feb 02 13:27:25 crc kubenswrapper[4721]: I0202 13:27:25.721745 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 02 13:27:25 crc kubenswrapper[4721]: I0202 13:27:25.988443 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 02 13:27:25 crc kubenswrapper[4721]: I0202 13:27:25.988909 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 02 13:27:27 crc kubenswrapper[4721]: I0202 13:27:27.929845 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 02 13:27:27 crc kubenswrapper[4721]: I0202 13:27:27.930196 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 02 13:27:28 crc kubenswrapper[4721]: I0202 13:27:28.942215 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8eccca7c-e269-4ecc-9fce-024196f66aaa" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.9:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 02 13:27:28 crc kubenswrapper[4721]: I0202 13:27:28.942230 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8eccca7c-e269-4ecc-9fce-024196f66aaa" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.9:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 02 13:27:30 crc kubenswrapper[4721]: I0202 13:27:30.721446 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 02 13:27:30 crc kubenswrapper[4721]: I0202 13:27:30.758452 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 02 13:27:30 crc kubenswrapper[4721]: I0202 13:27:30.990793 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 02 13:27:30 crc kubenswrapper[4721]: I0202 13:27:30.991482 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 02 13:27:31 crc kubenswrapper[4721]: I0202 13:27:31.447668 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 02 13:27:32 crc kubenswrapper[4721]: I0202 13:27:32.007275 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f6e14b26-cab3-4acd-aad2-8cda004e0282" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.11:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 02 13:27:32 crc kubenswrapper[4721]: I0202 13:27:32.007292 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f6e14b26-cab3-4acd-aad2-8cda004e0282" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.11:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 02 13:27:36 crc kubenswrapper[4721]: I0202 13:27:36.218034 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 02 13:27:37 crc kubenswrapper[4721]: I0202 13:27:37.946710 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 02 13:27:37 crc kubenswrapper[4721]: I0202 13:27:37.949513 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 02 13:27:37 crc kubenswrapper[4721]: I0202 13:27:37.953332 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 02 13:27:37 crc kubenswrapper[4721]: I0202 13:27:37.959472 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 02 13:27:38 crc kubenswrapper[4721]: I0202 13:27:38.515179 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 02 13:27:38 crc kubenswrapper[4721]: I0202 13:27:38.521446 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 02 13:27:40 crc kubenswrapper[4721]: I0202 13:27:40.997636 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 02 13:27:40 crc kubenswrapper[4721]: I0202 13:27:40.999113 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 02 13:27:41 crc kubenswrapper[4721]: I0202 13:27:41.003935 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 02 13:27:41 crc kubenswrapper[4721]: I0202 13:27:41.552204 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 02 13:28:14 crc kubenswrapper[4721]: I0202 13:28:14.763464 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 13:28:14 crc kubenswrapper[4721]: I0202 13:28:14.764112 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 13:28:44 crc kubenswrapper[4721]: I0202 13:28:44.763195 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 13:28:44 crc kubenswrapper[4721]: I0202 13:28:44.763755 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
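The startup-probe failures above ("Client.Timeout exceeded while awaiting headers" against https://10.217.1.9:8774/ and https://10.217.1.11:8775/) are transient: the services come up a few periods later and the probes flip to "started"/"ready". A hypothetical reconstruction of the kind of probe that produces this pattern is sketched below with k8s.io/api types; only the scheme, path, and port come from the log output, and every numeric field is an assumption for illustration.

// Hypothetical startup probe matching the failures logged above.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func novaAPIStartupProbe() *corev1.Probe {
	return &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Scheme: corev1.URISchemeHTTPS, // the probe URL in the log is https
				Path:   "/",
				Port:   intstr.FromInt(8774), // nova-api port from the probe URL
			},
		},
		TimeoutSeconds:   1,  // assumed; a slow first response would exceed this
		PeriodSeconds:    10, // assumed; roughly matches the spacing of the failures
		FailureThreshold: 30, // assumed; the pod does eventually report "started"
	}
}

func main() {
	fmt.Printf("probe: %+v\n", novaAPIStartupProbe())
}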
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:29:09 crc kubenswrapper[4721]: I0202 13:29:09.949253 4721 scope.go:117] "RemoveContainer" containerID="6404bb6f1951c91af9f8453984cfaf480f6307bbda26d7f2b85d0b4cee4e2109" Feb 02 13:29:09 crc kubenswrapper[4721]: I0202 13:29:09.985516 4721 scope.go:117] "RemoveContainer" containerID="80fd8d4c523d759364a503f4e0957e812daf0265c52b4e63f92c47e96ac7e275" Feb 02 13:29:10 crc kubenswrapper[4721]: I0202 13:29:10.016378 4721 scope.go:117] "RemoveContainer" containerID="a4049e92c383c0eb65178e6eb81956222b0a85112475c87886800360320c1322" Feb 02 13:29:10 crc kubenswrapper[4721]: I0202 13:29:10.106502 4721 scope.go:117] "RemoveContainer" containerID="ebcd143f9cd75602d2879409dcec3b4694439187ff4c0fda35cb07bd211f9634" Feb 02 13:29:10 crc kubenswrapper[4721]: I0202 13:29:10.182316 4721 scope.go:117] "RemoveContainer" containerID="53b5d624776f4223952c5574dd921abb5bb1d5c538eeb2e730f20d491cd8ec2a" Feb 02 13:29:10 crc kubenswrapper[4721]: I0202 13:29:10.260568 4721 scope.go:117] "RemoveContainer" containerID="d0477d4beedf8835ceccc8981c1de2a9fe8aa3519682eb80e7972c0762297343" Feb 02 13:29:10 crc kubenswrapper[4721]: I0202 13:29:10.293365 4721 scope.go:117] "RemoveContainer" containerID="c8ac03b5a6a963dc432f18a1012252ac15cbdaf5a852eb90b3130207aa267b95" Feb 02 13:29:10 crc kubenswrapper[4721]: I0202 13:29:10.347853 4721 scope.go:117] "RemoveContainer" containerID="74a99b13280ba5e058fb97f392a9a2baa22e1224fb962f08950b59d7a1606135" Feb 02 13:29:14 crc kubenswrapper[4721]: I0202 13:29:14.763473 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:29:14 crc kubenswrapper[4721]: I0202 13:29:14.764148 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:29:14 crc kubenswrapper[4721]: I0202 13:29:14.764204 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:29:14 crc kubenswrapper[4721]: I0202 13:29:14.765158 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 13:29:14 crc kubenswrapper[4721]: I0202 13:29:14.765206 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c" gracePeriod=600 Feb 02 13:29:14 crc kubenswrapper[4721]: E0202 13:29:14.892561 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:29:15 crc kubenswrapper[4721]: I0202 13:29:15.634435 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c" exitCode=0 Feb 02 13:29:15 crc kubenswrapper[4721]: I0202 13:29:15.634496 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"} Feb 02 13:29:15 crc kubenswrapper[4721]: I0202 13:29:15.634823 4721 scope.go:117] "RemoveContainer" containerID="4c89d3977af7fbb3779c0661dadea0111ac2d8f3c3974c534b682ad6a4af4aac" Feb 02 13:29:15 crc kubenswrapper[4721]: I0202 13:29:15.635726 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c" Feb 02 13:29:15 crc kubenswrapper[4721]: E0202 13:29:15.636286 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:29:27 crc kubenswrapper[4721]: I0202 13:29:27.410019 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c" Feb 02 13:29:27 crc kubenswrapper[4721]: E0202 13:29:27.410796 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:29:38 crc kubenswrapper[4721]: I0202 13:29:38.412583 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c" Feb 02 13:29:38 crc kubenswrapper[4721]: E0202 13:29:38.416000 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:29:51 crc kubenswrapper[4721]: I0202 13:29:51.409908 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c" Feb 02 13:29:51 crc kubenswrapper[4721]: E0202 13:29:51.410638 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.169269 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"]
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.171454 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.173741 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.173891 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.181951 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"]
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.270951 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d19b4436-4c9b-4671-acef-1ba5685cb660-secret-volume\") pod \"collect-profiles-29500650-fclwl\" (UID: \"d19b4436-4c9b-4671-acef-1ba5685cb660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.271068 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d19b4436-4c9b-4671-acef-1ba5685cb660-config-volume\") pod \"collect-profiles-29500650-fclwl\" (UID: \"d19b4436-4c9b-4671-acef-1ba5685cb660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.271147 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk5jt\" (UniqueName: \"kubernetes.io/projected/d19b4436-4c9b-4671-acef-1ba5685cb660-kube-api-access-zk5jt\") pod \"collect-profiles-29500650-fclwl\" (UID: \"d19b4436-4c9b-4671-acef-1ba5685cb660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.373588 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d19b4436-4c9b-4671-acef-1ba5685cb660-secret-volume\") pod \"collect-profiles-29500650-fclwl\" (UID: \"d19b4436-4c9b-4671-acef-1ba5685cb660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.373966 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d19b4436-4c9b-4671-acef-1ba5685cb660-config-volume\") pod \"collect-profiles-29500650-fclwl\" (UID: \"d19b4436-4c9b-4671-acef-1ba5685cb660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.374139 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk5jt\" (UniqueName: \"kubernetes.io/projected/d19b4436-4c9b-4671-acef-1ba5685cb660-kube-api-access-zk5jt\") pod \"collect-profiles-29500650-fclwl\" (UID: \"d19b4436-4c9b-4671-acef-1ba5685cb660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.375073 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d19b4436-4c9b-4671-acef-1ba5685cb660-config-volume\") pod \"collect-profiles-29500650-fclwl\" (UID: \"d19b4436-4c9b-4671-acef-1ba5685cb660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.379421 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d19b4436-4c9b-4671-acef-1ba5685cb660-secret-volume\") pod \"collect-profiles-29500650-fclwl\" (UID: \"d19b4436-4c9b-4671-acef-1ba5685cb660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.396569 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk5jt\" (UniqueName: \"kubernetes.io/projected/d19b4436-4c9b-4671-acef-1ba5685cb660-kube-api-access-zk5jt\") pod \"collect-profiles-29500650-fclwl\" (UID: \"d19b4436-4c9b-4671-acef-1ba5685cb660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.547442 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 02 13:30:00 crc kubenswrapper[4721]: I0202 13:30:00.555685 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"
Feb 02 13:30:01 crc kubenswrapper[4721]: I0202 13:30:01.034913 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"]
Feb 02 13:30:01 crc kubenswrapper[4721]: W0202 13:30:01.037025 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd19b4436_4c9b_4671_acef_1ba5685cb660.slice/crio-34a0f251bb9e78b0bfa37fe20d6c84f3dbcc2bc274affa0b31a69e126f8a70a5 WatchSource:0}: Error finding container 34a0f251bb9e78b0bfa37fe20d6c84f3dbcc2bc274affa0b31a69e126f8a70a5: Status 404 returned error can't find the container with id 34a0f251bb9e78b0bfa37fe20d6c84f3dbcc2bc274affa0b31a69e126f8a70a5
Feb 02 13:30:01 crc kubenswrapper[4721]: I0202 13:30:01.184017 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl" event={"ID":"d19b4436-4c9b-4671-acef-1ba5685cb660","Type":"ContainerStarted","Data":"34a0f251bb9e78b0bfa37fe20d6c84f3dbcc2bc274affa0b31a69e126f8a70a5"}
Feb 02 13:30:02 crc kubenswrapper[4721]: I0202 13:30:02.199546 4721 generic.go:334] "Generic (PLEG): container finished" podID="d19b4436-4c9b-4671-acef-1ba5685cb660" containerID="5d8115a3c44a297e5941de9c7ae62ed0d1533603d2bcff7cfc2aadd64924c9b1" exitCode=0
Feb 02 13:30:02 crc kubenswrapper[4721]: I0202 13:30:02.199654 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl" event={"ID":"d19b4436-4c9b-4671-acef-1ba5685cb660","Type":"ContainerDied","Data":"5d8115a3c44a297e5941de9c7ae62ed0d1533603d2bcff7cfc2aadd64924c9b1"}
Feb 02 13:30:03 crc kubenswrapper[4721]: I0202 13:30:03.599597 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"
Feb 02 13:30:03 crc kubenswrapper[4721]: I0202 13:30:03.652878 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk5jt\" (UniqueName: \"kubernetes.io/projected/d19b4436-4c9b-4671-acef-1ba5685cb660-kube-api-access-zk5jt\") pod \"d19b4436-4c9b-4671-acef-1ba5685cb660\" (UID: \"d19b4436-4c9b-4671-acef-1ba5685cb660\") "
Feb 02 13:30:03 crc kubenswrapper[4721]: I0202 13:30:03.653156 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d19b4436-4c9b-4671-acef-1ba5685cb660-secret-volume\") pod \"d19b4436-4c9b-4671-acef-1ba5685cb660\" (UID: \"d19b4436-4c9b-4671-acef-1ba5685cb660\") "
Feb 02 13:30:03 crc kubenswrapper[4721]: I0202 13:30:03.653314 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d19b4436-4c9b-4671-acef-1ba5685cb660-config-volume\") pod \"d19b4436-4c9b-4671-acef-1ba5685cb660\" (UID: \"d19b4436-4c9b-4671-acef-1ba5685cb660\") "
Feb 02 13:30:03 crc kubenswrapper[4721]: I0202 13:30:03.654618 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19b4436-4c9b-4671-acef-1ba5685cb660-config-volume" (OuterVolumeSpecName: "config-volume") pod "d19b4436-4c9b-4671-acef-1ba5685cb660" (UID: "d19b4436-4c9b-4671-acef-1ba5685cb660"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 13:30:03 crc kubenswrapper[4721]: I0202 13:30:03.660692 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19b4436-4c9b-4671-acef-1ba5685cb660-kube-api-access-zk5jt" (OuterVolumeSpecName: "kube-api-access-zk5jt") pod "d19b4436-4c9b-4671-acef-1ba5685cb660" (UID: "d19b4436-4c9b-4671-acef-1ba5685cb660"). InnerVolumeSpecName "kube-api-access-zk5jt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 13:30:03 crc kubenswrapper[4721]: I0202 13:30:03.661246 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19b4436-4c9b-4671-acef-1ba5685cb660-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d19b4436-4c9b-4671-acef-1ba5685cb660" (UID: "d19b4436-4c9b-4671-acef-1ba5685cb660"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 13:30:03 crc kubenswrapper[4721]: I0202 13:30:03.757939 4721 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d19b4436-4c9b-4671-acef-1ba5685cb660-config-volume\") on node \"crc\" DevicePath \"\""
Feb 02 13:30:03 crc kubenswrapper[4721]: I0202 13:30:03.757969 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk5jt\" (UniqueName: \"kubernetes.io/projected/d19b4436-4c9b-4671-acef-1ba5685cb660-kube-api-access-zk5jt\") on node \"crc\" DevicePath \"\""
Feb 02 13:30:03 crc kubenswrapper[4721]: I0202 13:30:03.757979 4721 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d19b4436-4c9b-4671-acef-1ba5685cb660-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 02 13:30:04 crc kubenswrapper[4721]: I0202 13:30:04.221851 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl" event={"ID":"d19b4436-4c9b-4671-acef-1ba5685cb660","Type":"ContainerDied","Data":"34a0f251bb9e78b0bfa37fe20d6c84f3dbcc2bc274affa0b31a69e126f8a70a5"}
Feb 02 13:30:04 crc kubenswrapper[4721]: I0202 13:30:04.222224 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34a0f251bb9e78b0bfa37fe20d6c84f3dbcc2bc274affa0b31a69e126f8a70a5"
Feb 02 13:30:04 crc kubenswrapper[4721]: I0202 13:30:04.221899 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"
Feb 02 13:30:06 crc kubenswrapper[4721]: I0202 13:30:06.409820 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
Feb 02 13:30:06 crc kubenswrapper[4721]: E0202 13:30:06.410738 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 13:30:10 crc kubenswrapper[4721]: I0202 13:30:10.571457 4721 scope.go:117] "RemoveContainer" containerID="59e47087e25d7a69cc9b0e24b51c0193c1d130de3a6fbb82bf929574bc9e38b6"
Feb 02 13:30:19 crc kubenswrapper[4721]: I0202 13:30:19.409860 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
Feb 02 13:30:19 crc kubenswrapper[4721]: E0202 13:30:19.410631 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 13:30:34 crc kubenswrapper[4721]: I0202 13:30:34.410834 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
Feb 02 13:30:34 crc kubenswrapper[4721]: E0202 13:30:34.411970 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 13:30:47 crc kubenswrapper[4721]: I0202 13:30:47.409845 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
Feb 02 13:30:47 crc kubenswrapper[4721]: E0202 13:30:47.410811 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 13:31:00 crc kubenswrapper[4721]: I0202 13:31:00.429549 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
Feb 02 13:31:00 crc kubenswrapper[4721]: E0202 13:31:00.432956 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 13:31:13 crc kubenswrapper[4721]: I0202 13:31:13.411032 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
Feb 02 13:31:13 crc kubenswrapper[4721]: E0202 13:31:13.412111 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 13:31:25 crc kubenswrapper[4721]: I0202 13:31:25.414943 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
Feb 02 13:31:25 crc kubenswrapper[4721]: E0202 13:31:25.416279 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 13:31:36 crc kubenswrapper[4721]: I0202 13:31:36.409752 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
Feb 02 13:31:36 crc kubenswrapper[4721]: E0202 13:31:36.410641 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 13:31:51 crc kubenswrapper[4721]: I0202 13:31:51.410084 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
Feb 02 13:31:51 crc kubenswrapper[4721]: E0202 13:31:51.410933 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 13:32:02 crc kubenswrapper[4721]: I0202 13:32:02.411477 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
Feb 02 13:32:02 crc kubenswrapper[4721]: E0202 13:32:02.412721 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 13:32:15 crc kubenswrapper[4721]: I0202 13:32:15.410353 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c" Feb 02 13:32:15 crc kubenswrapper[4721]: E0202 13:32:15.411606 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:32:30 crc kubenswrapper[4721]: I0202 13:32:30.419058 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c" Feb 02 13:32:30 crc kubenswrapper[4721]: E0202 13:32:30.419840 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:32:31 crc kubenswrapper[4721]: I0202 13:32:31.069300 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-04a7-account-create-update-xhlq8"] Feb 02 13:32:31 crc kubenswrapper[4721]: I0202 13:32:31.080375 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-4d7hn"] Feb 02 13:32:31 crc kubenswrapper[4721]: I0202 13:32:31.091971 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-sl4gx"] Feb 02 13:32:31 crc kubenswrapper[4721]: I0202 13:32:31.102195 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1d7e-account-create-update-7jmk5"] Feb 02 13:32:31 crc kubenswrapper[4721]: I0202 13:32:31.112047 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-04a7-account-create-update-xhlq8"] Feb 02 13:32:31 crc kubenswrapper[4721]: I0202 13:32:31.121800 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-sl4gx"] Feb 02 13:32:31 crc kubenswrapper[4721]: I0202 13:32:31.131726 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1d7e-account-create-update-7jmk5"] Feb 02 13:32:31 crc kubenswrapper[4721]: I0202 13:32:31.141061 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-4d7hn"] Feb 02 13:32:32 crc kubenswrapper[4721]: I0202 13:32:32.424323 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e1ef9e5-26ab-4b7b-b255-73968ed867ce" path="/var/lib/kubelet/pods/4e1ef9e5-26ab-4b7b-b255-73968ed867ce/volumes" Feb 02 13:32:32 crc kubenswrapper[4721]: I0202 13:32:32.427711 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b74e699-bc4f-4415-a9dc-8ad52d916bc0" path="/var/lib/kubelet/pods/8b74e699-bc4f-4415-a9dc-8ad52d916bc0/volumes" Feb 02 13:32:32 crc kubenswrapper[4721]: I0202 13:32:32.430686 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d46c6f8-aff0-4b28-a71b-d98a894afdaf" path="/var/lib/kubelet/pods/9d46c6f8-aff0-4b28-a71b-d98a894afdaf/volumes" Feb 02 13:32:32 crc kubenswrapper[4721]: I0202 13:32:32.431999 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b5e72b39-6085-4753-8b7d-a93a80c95d49" path="/var/lib/kubelet/pods/b5e72b39-6085-4753-8b7d-a93a80c95d49/volumes" Feb 02 13:32:35 crc kubenswrapper[4721]: I0202 13:32:35.056462 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-b945b"] Feb 02 13:32:35 crc kubenswrapper[4721]: I0202 13:32:35.071305 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-e588-account-create-update-4crm9"] Feb 02 13:32:35 crc kubenswrapper[4721]: I0202 13:32:35.082580 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-b945b"] Feb 02 13:32:35 crc kubenswrapper[4721]: I0202 13:32:35.102760 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-e588-account-create-update-4crm9"] Feb 02 13:32:35 crc kubenswrapper[4721]: I0202 13:32:35.123827 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-4msnh"] Feb 02 13:32:35 crc kubenswrapper[4721]: I0202 13:32:35.131436 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-4msnh"] Feb 02 13:32:35 crc kubenswrapper[4721]: I0202 13:32:35.144363 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-f5fd-account-create-update-8s8md"] Feb 02 13:32:35 crc kubenswrapper[4721]: I0202 13:32:35.159579 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-f5fd-account-create-update-8s8md"] Feb 02 13:32:36 crc kubenswrapper[4721]: I0202 13:32:36.422648 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd" path="/var/lib/kubelet/pods/3d4fa2b7-60f7-4bca-b4a6-751dc5f853dd/volumes" Feb 02 13:32:36 crc kubenswrapper[4721]: I0202 13:32:36.428234 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8261a2f3-c66a-441c-9fc6-a7a6a744b8a3" path="/var/lib/kubelet/pods/8261a2f3-c66a-441c-9fc6-a7a6a744b8a3/volumes" Feb 02 13:32:36 crc kubenswrapper[4721]: I0202 13:32:36.430954 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d51234ae-bf99-49bc-a3bc-1b392f993726" path="/var/lib/kubelet/pods/d51234ae-bf99-49bc-a3bc-1b392f993726/volumes" Feb 02 13:32:36 crc kubenswrapper[4721]: I0202 13:32:36.433213 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e071a9e9-d1fa-41c2-a0b4-3ddc2470055b" path="/var/lib/kubelet/pods/e071a9e9-d1fa-41c2-a0b4-3ddc2470055b/volumes" Feb 02 13:32:41 crc kubenswrapper[4721]: I0202 13:32:41.078672 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5"] Feb 02 13:32:41 crc kubenswrapper[4721]: I0202 13:32:41.092403 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-ed80-account-create-update-w8c4k"] Feb 02 13:32:41 crc kubenswrapper[4721]: I0202 13:32:41.101965 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-4hfg5"] Feb 02 13:32:41 crc kubenswrapper[4721]: I0202 13:32:41.112271 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-ed80-account-create-update-w8c4k"] Feb 02 13:32:42 crc kubenswrapper[4721]: I0202 13:32:42.443658 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0af979c8-207f-455c-b383-fd22b1ec6758" path="/var/lib/kubelet/pods/0af979c8-207f-455c-b383-fd22b1ec6758/volumes" Feb 02 13:32:42 crc kubenswrapper[4721]: I0202 13:32:42.445660 4721 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="5b1f70a8-6b41-4823-991b-934510a608fd" path="/var/lib/kubelet/pods/5b1f70a8-6b41-4823-991b-934510a608fd/volumes" Feb 02 13:32:45 crc kubenswrapper[4721]: I0202 13:32:45.410395 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c" Feb 02 13:32:45 crc kubenswrapper[4721]: E0202 13:32:45.411167 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:32:58 crc kubenswrapper[4721]: I0202 13:32:58.454582 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c" Feb 02 13:32:58 crc kubenswrapper[4721]: E0202 13:32:58.455605 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.048591 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-wffvl"] Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.064188 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9865-account-create-update-5xd7v"] Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.079830 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-wffvl"] Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.091386 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9865-account-create-update-5xd7v"] Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.100996 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-xmp7t"] Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.112311 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-xmp7t"] Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.123215 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-2whnq"] Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.137307 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-2whnq"] Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.155136 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-219b-account-create-update-c48ml"] Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.165531 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-15d5-account-create-update-5kl6r"] Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.176640 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-219b-account-create-update-c48ml"] Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.190231 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-15d5-account-create-update-5kl6r"] Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 
13:33:04.435300 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13666544-a226-43ee-84c9-3232e9fff8d4" path="/var/lib/kubelet/pods/13666544-a226-43ee-84c9-3232e9fff8d4/volumes" Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.436564 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="375b0aad-b921-41d8-af30-181ac4a73c0b" path="/var/lib/kubelet/pods/375b0aad-b921-41d8-af30-181ac4a73c0b/volumes" Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.437728 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46b707f0-c9cf-46b5-b615-4c0ab1da0391" path="/var/lib/kubelet/pods/46b707f0-c9cf-46b5-b615-4c0ab1da0391/volumes" Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.438614 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51bc4821-8b8e-4972-a90e-67a7a7b1fee5" path="/var/lib/kubelet/pods/51bc4821-8b8e-4972-a90e-67a7a7b1fee5/volumes" Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.440028 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67f56b66-72ae-4c95-8051-dc5f7a0faec4" path="/var/lib/kubelet/pods/67f56b66-72ae-4c95-8051-dc5f7a0faec4/volumes" Feb 02 13:33:04 crc kubenswrapper[4721]: I0202 13:33:04.441243 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="937d142a-7868-4de2-85f3-90dcc5a74019" path="/var/lib/kubelet/pods/937d142a-7868-4de2-85f3-90dcc5a74019/volumes" Feb 02 13:33:05 crc kubenswrapper[4721]: I0202 13:33:05.052930 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-6f5e-account-create-update-nk8v6"] Feb 02 13:33:05 crc kubenswrapper[4721]: I0202 13:33:05.067457 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-xs4g5"] Feb 02 13:33:05 crc kubenswrapper[4721]: I0202 13:33:05.078608 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-xs4g5"] Feb 02 13:33:05 crc kubenswrapper[4721]: I0202 13:33:05.090508 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-6f5e-account-create-update-nk8v6"] Feb 02 13:33:06 crc kubenswrapper[4721]: I0202 13:33:06.425869 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d74a2f0-9f60-4f59-92e4-11b9136f1db5" path="/var/lib/kubelet/pods/0d74a2f0-9f60-4f59-92e4-11b9136f1db5/volumes" Feb 02 13:33:06 crc kubenswrapper[4721]: I0202 13:33:06.427568 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45ad4533-c6a5-49da-8f33-23113f8b7fea" path="/var/lib/kubelet/pods/45ad4533-c6a5-49da-8f33-23113f8b7fea/volumes" Feb 02 13:33:09 crc kubenswrapper[4721]: I0202 13:33:09.409508 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c" Feb 02 13:33:09 crc kubenswrapper[4721]: E0202 13:33:09.410159 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:33:10 crc kubenswrapper[4721]: I0202 13:33:10.721314 4721 scope.go:117] "RemoveContainer" containerID="c60a5b8d8e8a5c3310b5b5e67ee44c42a8f4c0e6c7827776fd046304bab7b307" Feb 02 13:33:10 crc kubenswrapper[4721]: I0202 13:33:10.770082 4721 scope.go:117] 
"RemoveContainer" containerID="c036ed84a2cba404110a4db04b8c7d0f021199196a70d367772128ca1a327056" Feb 02 13:33:10 crc kubenswrapper[4721]: I0202 13:33:10.828178 4721 scope.go:117] "RemoveContainer" containerID="3fc0efffdb16822dab9442b0b34ece997fbe26ebe5808aadd9adb66a693a7dd5" Feb 02 13:33:10 crc kubenswrapper[4721]: I0202 13:33:10.881478 4721 scope.go:117] "RemoveContainer" containerID="ac2082a5d3a7b825797912c8d9660423dc0e0e1d5b6ff60e8c46690201c145fc" Feb 02 13:33:10 crc kubenswrapper[4721]: I0202 13:33:10.911591 4721 scope.go:117] "RemoveContainer" containerID="075500f50301a059fcb64d3fd73cd025d86472e12f54ce995ede7cc3876a10cc" Feb 02 13:33:10 crc kubenswrapper[4721]: I0202 13:33:10.974988 4721 scope.go:117] "RemoveContainer" containerID="4b856042309c84ea0a6e8c400ccf627a4542f6766de95577e1154ef8996d41d0" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 13:33:11.032435 4721 scope.go:117] "RemoveContainer" containerID="fdd34df09b38c8d4122669f8f4cba7de1072b0822ab1da7e4f47ca0b7bdd4576" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 13:33:11.053883 4721 scope.go:117] "RemoveContainer" containerID="c6d0cc979d5c7bfcd7c17e38f66db5aa66eb2098b9d5dff2ff5da7fb49088c43" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 13:33:11.122061 4721 scope.go:117] "RemoveContainer" containerID="4fba6951bb982a13a5360303ec96e98896da4f493023d6f3bda466f64f4a3da5" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 13:33:11.148561 4721 scope.go:117] "RemoveContainer" containerID="941cea547a9c6139f4cd34b3cba1a9232469327b41a70627da4554d99e83c28b" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 13:33:11.170221 4721 scope.go:117] "RemoveContainer" containerID="17ddfbe07f4d8bd38ac75c2dd4cd30a97224663bc355d41b2756657171654039" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 13:33:11.221260 4721 scope.go:117] "RemoveContainer" containerID="49e2f9d6a9f5b04c1ec533b19afe36a66018a912e5ef184f9b92ab178816de33" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 13:33:11.249511 4721 scope.go:117] "RemoveContainer" containerID="ce2892d88a036a25fa1c546c389b3e2c80e44cee3ffdb13ac0e6fe0ce93c414d" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 13:33:11.271599 4721 scope.go:117] "RemoveContainer" containerID="fa62cc31d8fc9109f8c7236f7067b1ae22093077c72a6872a6dc77d5cf6674c5" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 13:33:11.293389 4721 scope.go:117] "RemoveContainer" containerID="2e794fb911c91ba362224447e9ecd49d098fbbb2f1fb15a45aedc118537561bc" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 13:33:11.315516 4721 scope.go:117] "RemoveContainer" containerID="2e42426b62e4be73df9f47cc7c9f8475c22ad72a27299b42b9cb2460af93ca8b" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 13:33:11.344547 4721 scope.go:117] "RemoveContainer" containerID="b4ecf1fb05394c16a116b372fdffdfa0e7375e6cf9e5a5e825266b2a826c68fd" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 13:33:11.375181 4721 scope.go:117] "RemoveContainer" containerID="11d4406ea2aeef5800b1d48d5c16350e8f64df4bb7e540c2b8bb59f164e7298a" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 13:33:11.403038 4721 scope.go:117] "RemoveContainer" containerID="27b5beb216b40a6bf47c26d5501508f64c30a9831dd1df313dcf922bbdd6bfbf" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 13:33:11.430795 4721 scope.go:117] "RemoveContainer" containerID="041fca898cdc3bff357d1c5b88b2d8189fb9511b9b78d7578e5238edecd243a2" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 13:33:11.460564 4721 scope.go:117] "RemoveContainer" containerID="ebaaea23221e663dcf2fea49186c159f19607ea1b1253a2c8f769a54c895b470" Feb 02 13:33:11 crc kubenswrapper[4721]: I0202 
Feb 02 13:33:18 crc kubenswrapper[4721]: I0202 13:33:18.066178 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-wvp2n"]
Feb 02 13:33:18 crc kubenswrapper[4721]: I0202 13:33:18.078837 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-wvp2n"]
Feb 02 13:33:18 crc kubenswrapper[4721]: I0202 13:33:18.425448 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3395efa1-7b43-4b48-9e06-764b9428c5ab" path="/var/lib/kubelet/pods/3395efa1-7b43-4b48-9e06-764b9428c5ab/volumes"
Feb 02 13:33:20 crc kubenswrapper[4721]: I0202 13:33:20.424713 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
Feb 02 13:33:20 crc kubenswrapper[4721]: E0202 13:33:20.426322 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 13:33:24 crc kubenswrapper[4721]: I0202 13:33:24.039377 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-h675n"]
Feb 02 13:33:24 crc kubenswrapper[4721]: I0202 13:33:24.055057 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-h675n"]
Feb 02 13:33:24 crc kubenswrapper[4721]: I0202 13:33:24.429585 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ef45b1-9ff2-40ca-950a-07746f51eca9" path="/var/lib/kubelet/pods/71ef45b1-9ff2-40ca-950a-07746f51eca9/volumes"
Feb 02 13:33:28 crc kubenswrapper[4721]: I0202 13:33:28.041927 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hrqtc"]
Feb 02 13:33:28 crc kubenswrapper[4721]: I0202 13:33:28.060312 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hrqtc"]
Feb 02 13:33:28 crc kubenswrapper[4721]: I0202 13:33:28.424871 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0531b398-2d44-42c2-bd6c-9e9f7ab8c85d" path="/var/lib/kubelet/pods/0531b398-2d44-42c2-bd6c-9e9f7ab8c85d/volumes"
Feb 02 13:33:35 crc kubenswrapper[4721]: I0202 13:33:35.409986 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
Feb 02 13:33:35 crc kubenswrapper[4721]: E0202 13:33:35.410662 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 13:33:49 crc kubenswrapper[4721]: I0202 13:33:49.410351 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
Feb 02 13:33:49 crc kubenswrapper[4721]: E0202 13:33:49.411279 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 13:33:50 crc kubenswrapper[4721]: I0202 13:33:50.066631 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-2tnbk"]
Feb 02 13:33:50 crc kubenswrapper[4721]: I0202 13:33:50.079211 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-2tnbk"]
Feb 02 13:33:50 crc kubenswrapper[4721]: I0202 13:33:50.422858 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="026bbe7a-aec9-40ee-9be3-cdb35054e076" path="/var/lib/kubelet/pods/026bbe7a-aec9-40ee-9be3-cdb35054e076/volumes"
Feb 02 13:33:57 crc kubenswrapper[4721]: I0202 13:33:57.029794 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-86z2v"]
Feb 02 13:33:57 crc kubenswrapper[4721]: I0202 13:33:57.039837 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-86z2v"]
Feb 02 13:33:58 crc kubenswrapper[4721]: I0202 13:33:58.432439 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdd67c16-7130-4095-952f-006aa5bcd5bb" path="/var/lib/kubelet/pods/bdd67c16-7130-4095-952f-006aa5bcd5bb/volumes"
Feb 02 13:34:03 crc kubenswrapper[4721]: I0202 13:34:03.410600 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
Feb 02 13:34:03 crc kubenswrapper[4721]: E0202 13:34:03.411377 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 13:34:04 crc kubenswrapper[4721]: I0202 13:34:04.031994 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dw7nl"]
Feb 02 13:34:04 crc kubenswrapper[4721]: I0202 13:34:04.043682 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dw7nl"]
Feb 02 13:34:04 crc kubenswrapper[4721]: I0202 13:34:04.425908 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d168e414-ab7e-45ad-b142-25dcc1c359b0" path="/var/lib/kubelet/pods/d168e414-ab7e-45ad-b142-25dcc1c359b0/volumes"
Feb 02 13:34:12 crc kubenswrapper[4721]: I0202 13:34:12.015355 4721 scope.go:117] "RemoveContainer" containerID="e7d18b6bc119712cab584a04197b6912f210c5505b0b18fd032278cec2f8f3b5"
Feb 02 13:34:12 crc kubenswrapper[4721]: I0202 13:34:12.057574 4721 scope.go:117] "RemoveContainer" containerID="08a73cbae26287f30a607e9d3bb9b367097d0316dac19c52bb303a922febd87c"
Feb 02 13:34:12 crc kubenswrapper[4721]: I0202 13:34:12.112865 4721 scope.go:117] "RemoveContainer" containerID="3705d645077158cd12edf8f0f9b5a39f0ba95d5854f57c056a964be7f2bc24c9"
Feb 02 13:34:12 crc kubenswrapper[4721]: I0202 13:34:12.162949 4721 scope.go:117] "RemoveContainer" containerID="759c834b6a1cc62188124483c6831d2bab037f76c9aac624de4118e2066fe35a"
Feb 02 13:34:12 crc kubenswrapper[4721]: I0202 13:34:12.267532 4721 scope.go:117] "RemoveContainer" containerID="3bcf52e41be39651ef275074658dfd7b224ff525c369aa709ad250ff99c12eb1"
Feb 02 13:34:12 crc kubenswrapper[4721]: I0202 13:34:12.299576 4721 scope.go:117] "RemoveContainer" containerID="4fe33e5a77c87b36e6e67a1b771c77b2a949e44a54e8c2f88295e47c2b68d215"
Feb 02 13:34:16 crc kubenswrapper[4721]: I0202 13:34:16.410435 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c"
Feb 02 13:34:17 crc kubenswrapper[4721]: I0202 13:34:17.119672 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"8db65d587513bf606777ac46eb9a8dd677cd18af3524da4a78bb404625c5d58f"}
Feb 02 13:34:18 crc kubenswrapper[4721]: I0202 13:34:18.061197 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-cgqfl"]
Feb 02 13:34:18 crc kubenswrapper[4721]: I0202 13:34:18.082202 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-cgqfl"]
Feb 02 13:34:18 crc kubenswrapper[4721]: I0202 13:34:18.424870 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47a4176b-5f58-47a9-a614-e5d05526da18" path="/var/lib/kubelet/pods/47a4176b-5f58-47a9-a614-e5d05526da18/volumes"
Feb 02 13:34:20 crc kubenswrapper[4721]: I0202 13:34:20.487692 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fmst6"]
Feb 02 13:34:20 crc kubenswrapper[4721]: E0202 13:34:20.489235 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19b4436-4c9b-4671-acef-1ba5685cb660" containerName="collect-profiles"
Feb 02 13:34:20 crc kubenswrapper[4721]: I0202 13:34:20.489260 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19b4436-4c9b-4671-acef-1ba5685cb660" containerName="collect-profiles"
Feb 02 13:34:20 crc kubenswrapper[4721]: I0202 13:34:20.489635 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="d19b4436-4c9b-4671-acef-1ba5685cb660" containerName="collect-profiles"
Feb 02 13:34:20 crc kubenswrapper[4721]: I0202 13:34:20.492524 4721 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:34:20 crc kubenswrapper[4721]: I0202 13:34:20.513493 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fmst6"] Feb 02 13:34:20 crc kubenswrapper[4721]: I0202 13:34:20.588742 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8phmt\" (UniqueName: \"kubernetes.io/projected/8972ee68-f222-49f3-8c06-5ba78388a6cd-kube-api-access-8phmt\") pod \"redhat-operators-fmst6\" (UID: \"8972ee68-f222-49f3-8c06-5ba78388a6cd\") " pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:34:20 crc kubenswrapper[4721]: I0202 13:34:20.588828 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8972ee68-f222-49f3-8c06-5ba78388a6cd-catalog-content\") pod \"redhat-operators-fmst6\" (UID: \"8972ee68-f222-49f3-8c06-5ba78388a6cd\") " pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:34:20 crc kubenswrapper[4721]: I0202 13:34:20.589269 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8972ee68-f222-49f3-8c06-5ba78388a6cd-utilities\") pod \"redhat-operators-fmst6\" (UID: \"8972ee68-f222-49f3-8c06-5ba78388a6cd\") " pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:34:20 crc kubenswrapper[4721]: I0202 13:34:20.691946 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8phmt\" (UniqueName: \"kubernetes.io/projected/8972ee68-f222-49f3-8c06-5ba78388a6cd-kube-api-access-8phmt\") pod \"redhat-operators-fmst6\" (UID: \"8972ee68-f222-49f3-8c06-5ba78388a6cd\") " pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:34:20 crc kubenswrapper[4721]: I0202 13:34:20.692025 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8972ee68-f222-49f3-8c06-5ba78388a6cd-catalog-content\") pod \"redhat-operators-fmst6\" (UID: \"8972ee68-f222-49f3-8c06-5ba78388a6cd\") " pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:34:20 crc kubenswrapper[4721]: I0202 13:34:20.692201 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8972ee68-f222-49f3-8c06-5ba78388a6cd-utilities\") pod \"redhat-operators-fmst6\" (UID: \"8972ee68-f222-49f3-8c06-5ba78388a6cd\") " pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:34:20 crc kubenswrapper[4721]: I0202 13:34:20.692669 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8972ee68-f222-49f3-8c06-5ba78388a6cd-catalog-content\") pod \"redhat-operators-fmst6\" (UID: \"8972ee68-f222-49f3-8c06-5ba78388a6cd\") " pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:34:20 crc kubenswrapper[4721]: I0202 13:34:20.692683 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8972ee68-f222-49f3-8c06-5ba78388a6cd-utilities\") pod \"redhat-operators-fmst6\" (UID: \"8972ee68-f222-49f3-8c06-5ba78388a6cd\") " pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:34:20 crc kubenswrapper[4721]: I0202 13:34:20.713136 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8phmt\" (UniqueName: \"kubernetes.io/projected/8972ee68-f222-49f3-8c06-5ba78388a6cd-kube-api-access-8phmt\") pod \"redhat-operators-fmst6\" (UID: \"8972ee68-f222-49f3-8c06-5ba78388a6cd\") " pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:34:20 crc kubenswrapper[4721]: I0202 13:34:20.816132 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:34:21 crc kubenswrapper[4721]: I0202 13:34:21.098157 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-84s2n"] Feb 02 13:34:21 crc kubenswrapper[4721]: I0202 13:34:21.100895 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:21 crc kubenswrapper[4721]: I0202 13:34:21.111989 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84s2n"] Feb 02 13:34:21 crc kubenswrapper[4721]: I0202 13:34:21.220505 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbp4d\" (UniqueName: \"kubernetes.io/projected/0d44a404-e01d-4a19-a103-58cbbafbdc7b-kube-api-access-zbp4d\") pod \"community-operators-84s2n\" (UID: \"0d44a404-e01d-4a19-a103-58cbbafbdc7b\") " pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:21 crc kubenswrapper[4721]: I0202 13:34:21.220672 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d44a404-e01d-4a19-a103-58cbbafbdc7b-utilities\") pod \"community-operators-84s2n\" (UID: \"0d44a404-e01d-4a19-a103-58cbbafbdc7b\") " pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:21 crc kubenswrapper[4721]: I0202 13:34:21.220804 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d44a404-e01d-4a19-a103-58cbbafbdc7b-catalog-content\") pod \"community-operators-84s2n\" (UID: \"0d44a404-e01d-4a19-a103-58cbbafbdc7b\") " pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:21 crc kubenswrapper[4721]: I0202 13:34:21.322730 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d44a404-e01d-4a19-a103-58cbbafbdc7b-catalog-content\") pod \"community-operators-84s2n\" (UID: \"0d44a404-e01d-4a19-a103-58cbbafbdc7b\") " pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:21 crc kubenswrapper[4721]: I0202 13:34:21.322855 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbp4d\" (UniqueName: \"kubernetes.io/projected/0d44a404-e01d-4a19-a103-58cbbafbdc7b-kube-api-access-zbp4d\") pod \"community-operators-84s2n\" (UID: \"0d44a404-e01d-4a19-a103-58cbbafbdc7b\") " pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:21 crc kubenswrapper[4721]: I0202 13:34:21.322936 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d44a404-e01d-4a19-a103-58cbbafbdc7b-utilities\") pod \"community-operators-84s2n\" (UID: \"0d44a404-e01d-4a19-a103-58cbbafbdc7b\") " pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:21 crc kubenswrapper[4721]: I0202 13:34:21.323508 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d44a404-e01d-4a19-a103-58cbbafbdc7b-catalog-content\") pod \"community-operators-84s2n\" (UID: \"0d44a404-e01d-4a19-a103-58cbbafbdc7b\") " pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:21 crc kubenswrapper[4721]: I0202 13:34:21.323535 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d44a404-e01d-4a19-a103-58cbbafbdc7b-utilities\") pod \"community-operators-84s2n\" (UID: \"0d44a404-e01d-4a19-a103-58cbbafbdc7b\") " pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:21 crc kubenswrapper[4721]: I0202 13:34:21.357056 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbp4d\" (UniqueName: \"kubernetes.io/projected/0d44a404-e01d-4a19-a103-58cbbafbdc7b-kube-api-access-zbp4d\") pod \"community-operators-84s2n\" (UID: \"0d44a404-e01d-4a19-a103-58cbbafbdc7b\") " pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:21 crc kubenswrapper[4721]: I0202 13:34:21.431802 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:21 crc kubenswrapper[4721]: W0202 13:34:21.434997 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8972ee68_f222_49f3_8c06_5ba78388a6cd.slice/crio-7d900a7ae12137341c4c93ae85dcbc015a3fb6c748d34ccdb600fe258145bb48 WatchSource:0}: Error finding container 7d900a7ae12137341c4c93ae85dcbc015a3fb6c748d34ccdb600fe258145bb48: Status 404 returned error can't find the container with id 7d900a7ae12137341c4c93ae85dcbc015a3fb6c748d34ccdb600fe258145bb48 Feb 02 13:34:21 crc kubenswrapper[4721]: I0202 13:34:21.439669 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fmst6"] Feb 02 13:34:21 crc kubenswrapper[4721]: I0202 13:34:21.973005 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84s2n"] Feb 02 13:34:22 crc kubenswrapper[4721]: I0202 13:34:22.211869 4721 generic.go:334] "Generic (PLEG): container finished" podID="8972ee68-f222-49f3-8c06-5ba78388a6cd" containerID="f4f5a2a5cea46a62873fff8e7f1234ae4c404df64b0975cf55881d1c4fbdadb9" exitCode=0 Feb 02 13:34:22 crc kubenswrapper[4721]: I0202 13:34:22.211953 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmst6" event={"ID":"8972ee68-f222-49f3-8c06-5ba78388a6cd","Type":"ContainerDied","Data":"f4f5a2a5cea46a62873fff8e7f1234ae4c404df64b0975cf55881d1c4fbdadb9"} Feb 02 13:34:22 crc kubenswrapper[4721]: I0202 13:34:22.211984 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmst6" event={"ID":"8972ee68-f222-49f3-8c06-5ba78388a6cd","Type":"ContainerStarted","Data":"7d900a7ae12137341c4c93ae85dcbc015a3fb6c748d34ccdb600fe258145bb48"} Feb 02 13:34:22 crc kubenswrapper[4721]: I0202 13:34:22.217526 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84s2n" event={"ID":"0d44a404-e01d-4a19-a103-58cbbafbdc7b","Type":"ContainerStarted","Data":"1a886770e742da309dc26755bb066650eb49c943ca412e90da6eb83afff91090"} Feb 02 13:34:23 crc kubenswrapper[4721]: I0202 13:34:23.234970 4721 generic.go:334] "Generic (PLEG): container finished" podID="0d44a404-e01d-4a19-a103-58cbbafbdc7b" 
containerID="7c97f99bfe6221d8ea66bdfad4046f68469ef4ad2afed261914d06d6d1c3a74b" exitCode=0 Feb 02 13:34:23 crc kubenswrapper[4721]: I0202 13:34:23.235037 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84s2n" event={"ID":"0d44a404-e01d-4a19-a103-58cbbafbdc7b","Type":"ContainerDied","Data":"7c97f99bfe6221d8ea66bdfad4046f68469ef4ad2afed261914d06d6d1c3a74b"} Feb 02 13:34:23 crc kubenswrapper[4721]: I0202 13:34:23.238924 4721 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 13:34:24 crc kubenswrapper[4721]: I0202 13:34:24.249108 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84s2n" event={"ID":"0d44a404-e01d-4a19-a103-58cbbafbdc7b","Type":"ContainerStarted","Data":"3b43df588e2c12e4ac5c0b8c1354ac40786a7261c60c361f52e9f9fe304fcf3c"} Feb 02 13:34:25 crc kubenswrapper[4721]: I0202 13:34:25.270368 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmst6" event={"ID":"8972ee68-f222-49f3-8c06-5ba78388a6cd","Type":"ContainerStarted","Data":"d4ec6deb0c199fe2932aa5da9887fd2837b3431d82684c3e3ecea9b39b062f27"} Feb 02 13:34:27 crc kubenswrapper[4721]: I0202 13:34:27.029350 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-n52pp"] Feb 02 13:34:27 crc kubenswrapper[4721]: I0202 13:34:27.042457 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-n52pp"] Feb 02 13:34:27 crc kubenswrapper[4721]: I0202 13:34:27.305031 4721 generic.go:334] "Generic (PLEG): container finished" podID="0d44a404-e01d-4a19-a103-58cbbafbdc7b" containerID="3b43df588e2c12e4ac5c0b8c1354ac40786a7261c60c361f52e9f9fe304fcf3c" exitCode=0 Feb 02 13:34:27 crc kubenswrapper[4721]: I0202 13:34:27.305105 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84s2n" event={"ID":"0d44a404-e01d-4a19-a103-58cbbafbdc7b","Type":"ContainerDied","Data":"3b43df588e2c12e4ac5c0b8c1354ac40786a7261c60c361f52e9f9fe304fcf3c"} Feb 02 13:34:28 crc kubenswrapper[4721]: I0202 13:34:28.044651 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-7wjxh"] Feb 02 13:34:28 crc kubenswrapper[4721]: I0202 13:34:28.065756 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-7wjxh"] Feb 02 13:34:28 crc kubenswrapper[4721]: I0202 13:34:28.316015 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84s2n" event={"ID":"0d44a404-e01d-4a19-a103-58cbbafbdc7b","Type":"ContainerStarted","Data":"cf31c21a88a57036453ab7c75fbf9e8abaeb723450dd869a3f193c5095004d33"} Feb 02 13:34:28 crc kubenswrapper[4721]: I0202 13:34:28.370430 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-84s2n" podStartSLOduration=2.873681278 podStartE2EDuration="7.3704104s" podCreationTimestamp="2026-02-02 13:34:21 +0000 UTC" firstStartedPulling="2026-02-02 13:34:23.238563128 +0000 UTC m=+2003.541077517" lastFinishedPulling="2026-02-02 13:34:27.73529225 +0000 UTC m=+2008.037806639" observedRunningTime="2026-02-02 13:34:28.364775587 +0000 UTC m=+2008.667289986" watchObservedRunningTime="2026-02-02 13:34:28.3704104 +0000 UTC m=+2008.672924789" Feb 02 13:34:28 crc kubenswrapper[4721]: I0202 13:34:28.424506 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fa244a8-7588-4d87-bd5b-cbcd10780c83" 
path="/var/lib/kubelet/pods/9fa244a8-7588-4d87-bd5b-cbcd10780c83/volumes" Feb 02 13:34:28 crc kubenswrapper[4721]: I0202 13:34:28.425365 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad3578ef-5d1b-4c52-939c-237feadc1c5c" path="/var/lib/kubelet/pods/ad3578ef-5d1b-4c52-939c-237feadc1c5c/volumes" Feb 02 13:34:31 crc kubenswrapper[4721]: I0202 13:34:31.369773 4721 generic.go:334] "Generic (PLEG): container finished" podID="8972ee68-f222-49f3-8c06-5ba78388a6cd" containerID="d4ec6deb0c199fe2932aa5da9887fd2837b3431d82684c3e3ecea9b39b062f27" exitCode=0 Feb 02 13:34:31 crc kubenswrapper[4721]: I0202 13:34:31.369977 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmst6" event={"ID":"8972ee68-f222-49f3-8c06-5ba78388a6cd","Type":"ContainerDied","Data":"d4ec6deb0c199fe2932aa5da9887fd2837b3431d82684c3e3ecea9b39b062f27"} Feb 02 13:34:31 crc kubenswrapper[4721]: I0202 13:34:31.433422 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:31 crc kubenswrapper[4721]: I0202 13:34:31.433462 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:32 crc kubenswrapper[4721]: I0202 13:34:32.383221 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmst6" event={"ID":"8972ee68-f222-49f3-8c06-5ba78388a6cd","Type":"ContainerStarted","Data":"74a4c11b6ce3b5a6ca6b3a1f395910e994f318787b772befd6fcc6ba4a326763"} Feb 02 13:34:32 crc kubenswrapper[4721]: I0202 13:34:32.415670 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fmst6" podStartSLOduration=3.748827215 podStartE2EDuration="12.415650316s" podCreationTimestamp="2026-02-02 13:34:20 +0000 UTC" firstStartedPulling="2026-02-02 13:34:23.23864753 +0000 UTC m=+2003.541161959" lastFinishedPulling="2026-02-02 13:34:31.905470631 +0000 UTC m=+2012.207985060" observedRunningTime="2026-02-02 13:34:32.413793195 +0000 UTC m=+2012.716307594" watchObservedRunningTime="2026-02-02 13:34:32.415650316 +0000 UTC m=+2012.718164705" Feb 02 13:34:32 crc kubenswrapper[4721]: I0202 13:34:32.494299 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-84s2n" podUID="0d44a404-e01d-4a19-a103-58cbbafbdc7b" containerName="registry-server" probeResult="failure" output=< Feb 02 13:34:32 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:34:32 crc kubenswrapper[4721]: > Feb 02 13:34:40 crc kubenswrapper[4721]: I0202 13:34:40.817099 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:34:40 crc kubenswrapper[4721]: I0202 13:34:40.817685 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:34:41 crc kubenswrapper[4721]: I0202 13:34:41.887745 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fmst6" podUID="8972ee68-f222-49f3-8c06-5ba78388a6cd" containerName="registry-server" probeResult="failure" output=< Feb 02 13:34:41 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:34:41 crc kubenswrapper[4721]: > Feb 02 13:34:42 crc kubenswrapper[4721]: I0202 13:34:42.503661 4721 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/community-operators-84s2n" podUID="0d44a404-e01d-4a19-a103-58cbbafbdc7b" containerName="registry-server" probeResult="failure" output=< Feb 02 13:34:42 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:34:42 crc kubenswrapper[4721]: > Feb 02 13:34:51 crc kubenswrapper[4721]: I0202 13:34:51.480002 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:51 crc kubenswrapper[4721]: I0202 13:34:51.533989 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:51 crc kubenswrapper[4721]: I0202 13:34:51.740961 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84s2n"] Feb 02 13:34:51 crc kubenswrapper[4721]: I0202 13:34:51.897812 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fmst6" podUID="8972ee68-f222-49f3-8c06-5ba78388a6cd" containerName="registry-server" probeResult="failure" output=< Feb 02 13:34:51 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:34:51 crc kubenswrapper[4721]: > Feb 02 13:34:52 crc kubenswrapper[4721]: I0202 13:34:52.613481 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-84s2n" podUID="0d44a404-e01d-4a19-a103-58cbbafbdc7b" containerName="registry-server" containerID="cri-o://cf31c21a88a57036453ab7c75fbf9e8abaeb723450dd869a3f193c5095004d33" gracePeriod=2 Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.438640 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.580352 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d44a404-e01d-4a19-a103-58cbbafbdc7b-catalog-content\") pod \"0d44a404-e01d-4a19-a103-58cbbafbdc7b\" (UID: \"0d44a404-e01d-4a19-a103-58cbbafbdc7b\") " Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.580870 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d44a404-e01d-4a19-a103-58cbbafbdc7b-utilities\") pod \"0d44a404-e01d-4a19-a103-58cbbafbdc7b\" (UID: \"0d44a404-e01d-4a19-a103-58cbbafbdc7b\") " Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.581006 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbp4d\" (UniqueName: \"kubernetes.io/projected/0d44a404-e01d-4a19-a103-58cbbafbdc7b-kube-api-access-zbp4d\") pod \"0d44a404-e01d-4a19-a103-58cbbafbdc7b\" (UID: \"0d44a404-e01d-4a19-a103-58cbbafbdc7b\") " Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.582847 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d44a404-e01d-4a19-a103-58cbbafbdc7b-utilities" (OuterVolumeSpecName: "utilities") pod "0d44a404-e01d-4a19-a103-58cbbafbdc7b" (UID: "0d44a404-e01d-4a19-a103-58cbbafbdc7b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.603321 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d44a404-e01d-4a19-a103-58cbbafbdc7b-kube-api-access-zbp4d" (OuterVolumeSpecName: "kube-api-access-zbp4d") pod "0d44a404-e01d-4a19-a103-58cbbafbdc7b" (UID: "0d44a404-e01d-4a19-a103-58cbbafbdc7b"). InnerVolumeSpecName "kube-api-access-zbp4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.629314 4721 generic.go:334] "Generic (PLEG): container finished" podID="0d44a404-e01d-4a19-a103-58cbbafbdc7b" containerID="cf31c21a88a57036453ab7c75fbf9e8abaeb723450dd869a3f193c5095004d33" exitCode=0 Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.629363 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84s2n" event={"ID":"0d44a404-e01d-4a19-a103-58cbbafbdc7b","Type":"ContainerDied","Data":"cf31c21a88a57036453ab7c75fbf9e8abaeb723450dd869a3f193c5095004d33"} Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.629387 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84s2n" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.629409 4721 scope.go:117] "RemoveContainer" containerID="cf31c21a88a57036453ab7c75fbf9e8abaeb723450dd869a3f193c5095004d33" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.629394 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84s2n" event={"ID":"0d44a404-e01d-4a19-a103-58cbbafbdc7b","Type":"ContainerDied","Data":"1a886770e742da309dc26755bb066650eb49c943ca412e90da6eb83afff91090"} Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.635090 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d44a404-e01d-4a19-a103-58cbbafbdc7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d44a404-e01d-4a19-a103-58cbbafbdc7b" (UID: "0d44a404-e01d-4a19-a103-58cbbafbdc7b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.684270 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d44a404-e01d-4a19-a103-58cbbafbdc7b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.684310 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d44a404-e01d-4a19-a103-58cbbafbdc7b-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.684325 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbp4d\" (UniqueName: \"kubernetes.io/projected/0d44a404-e01d-4a19-a103-58cbbafbdc7b-kube-api-access-zbp4d\") on node \"crc\" DevicePath \"\"" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.685934 4721 scope.go:117] "RemoveContainer" containerID="3b43df588e2c12e4ac5c0b8c1354ac40786a7261c60c361f52e9f9fe304fcf3c" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.721988 4721 scope.go:117] "RemoveContainer" containerID="7c97f99bfe6221d8ea66bdfad4046f68469ef4ad2afed261914d06d6d1c3a74b" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.777491 4721 scope.go:117] "RemoveContainer" containerID="cf31c21a88a57036453ab7c75fbf9e8abaeb723450dd869a3f193c5095004d33" Feb 02 13:34:53 crc kubenswrapper[4721]: E0202 13:34:53.778062 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf31c21a88a57036453ab7c75fbf9e8abaeb723450dd869a3f193c5095004d33\": container with ID starting with cf31c21a88a57036453ab7c75fbf9e8abaeb723450dd869a3f193c5095004d33 not found: ID does not exist" containerID="cf31c21a88a57036453ab7c75fbf9e8abaeb723450dd869a3f193c5095004d33" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.778196 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf31c21a88a57036453ab7c75fbf9e8abaeb723450dd869a3f193c5095004d33"} err="failed to get container status \"cf31c21a88a57036453ab7c75fbf9e8abaeb723450dd869a3f193c5095004d33\": rpc error: code = NotFound desc = could not find container \"cf31c21a88a57036453ab7c75fbf9e8abaeb723450dd869a3f193c5095004d33\": container with ID starting with cf31c21a88a57036453ab7c75fbf9e8abaeb723450dd869a3f193c5095004d33 not found: ID does not exist" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.778226 4721 scope.go:117] "RemoveContainer" containerID="3b43df588e2c12e4ac5c0b8c1354ac40786a7261c60c361f52e9f9fe304fcf3c" Feb 02 13:34:53 crc kubenswrapper[4721]: E0202 13:34:53.778727 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b43df588e2c12e4ac5c0b8c1354ac40786a7261c60c361f52e9f9fe304fcf3c\": container with ID starting with 3b43df588e2c12e4ac5c0b8c1354ac40786a7261c60c361f52e9f9fe304fcf3c not found: ID does not exist" containerID="3b43df588e2c12e4ac5c0b8c1354ac40786a7261c60c361f52e9f9fe304fcf3c" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.778784 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b43df588e2c12e4ac5c0b8c1354ac40786a7261c60c361f52e9f9fe304fcf3c"} err="failed to get container status \"3b43df588e2c12e4ac5c0b8c1354ac40786a7261c60c361f52e9f9fe304fcf3c\": rpc error: code = NotFound desc = could not find container 
\"3b43df588e2c12e4ac5c0b8c1354ac40786a7261c60c361f52e9f9fe304fcf3c\": container with ID starting with 3b43df588e2c12e4ac5c0b8c1354ac40786a7261c60c361f52e9f9fe304fcf3c not found: ID does not exist" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.778812 4721 scope.go:117] "RemoveContainer" containerID="7c97f99bfe6221d8ea66bdfad4046f68469ef4ad2afed261914d06d6d1c3a74b" Feb 02 13:34:53 crc kubenswrapper[4721]: E0202 13:34:53.779128 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c97f99bfe6221d8ea66bdfad4046f68469ef4ad2afed261914d06d6d1c3a74b\": container with ID starting with 7c97f99bfe6221d8ea66bdfad4046f68469ef4ad2afed261914d06d6d1c3a74b not found: ID does not exist" containerID="7c97f99bfe6221d8ea66bdfad4046f68469ef4ad2afed261914d06d6d1c3a74b" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.779231 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c97f99bfe6221d8ea66bdfad4046f68469ef4ad2afed261914d06d6d1c3a74b"} err="failed to get container status \"7c97f99bfe6221d8ea66bdfad4046f68469ef4ad2afed261914d06d6d1c3a74b\": rpc error: code = NotFound desc = could not find container \"7c97f99bfe6221d8ea66bdfad4046f68469ef4ad2afed261914d06d6d1c3a74b\": container with ID starting with 7c97f99bfe6221d8ea66bdfad4046f68469ef4ad2afed261914d06d6d1c3a74b not found: ID does not exist" Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.959493 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84s2n"] Feb 02 13:34:53 crc kubenswrapper[4721]: I0202 13:34:53.968311 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-84s2n"] Feb 02 13:34:54 crc kubenswrapper[4721]: I0202 13:34:54.433196 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d44a404-e01d-4a19-a103-58cbbafbdc7b" path="/var/lib/kubelet/pods/0d44a404-e01d-4a19-a103-58cbbafbdc7b/volumes" Feb 02 13:35:00 crc kubenswrapper[4721]: I0202 13:35:00.884344 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:35:00 crc kubenswrapper[4721]: I0202 13:35:00.937612 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:35:01 crc kubenswrapper[4721]: I0202 13:35:01.119352 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fmst6"] Feb 02 13:35:02 crc kubenswrapper[4721]: I0202 13:35:02.728983 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fmst6" podUID="8972ee68-f222-49f3-8c06-5ba78388a6cd" containerName="registry-server" containerID="cri-o://74a4c11b6ce3b5a6ca6b3a1f395910e994f318787b772befd6fcc6ba4a326763" gracePeriod=2 Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.275473 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.394160 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8972ee68-f222-49f3-8c06-5ba78388a6cd-catalog-content\") pod \"8972ee68-f222-49f3-8c06-5ba78388a6cd\" (UID: \"8972ee68-f222-49f3-8c06-5ba78388a6cd\") " Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.394468 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8972ee68-f222-49f3-8c06-5ba78388a6cd-utilities\") pod \"8972ee68-f222-49f3-8c06-5ba78388a6cd\" (UID: \"8972ee68-f222-49f3-8c06-5ba78388a6cd\") " Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.395020 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8phmt\" (UniqueName: \"kubernetes.io/projected/8972ee68-f222-49f3-8c06-5ba78388a6cd-kube-api-access-8phmt\") pod \"8972ee68-f222-49f3-8c06-5ba78388a6cd\" (UID: \"8972ee68-f222-49f3-8c06-5ba78388a6cd\") " Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.396527 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8972ee68-f222-49f3-8c06-5ba78388a6cd-utilities" (OuterVolumeSpecName: "utilities") pod "8972ee68-f222-49f3-8c06-5ba78388a6cd" (UID: "8972ee68-f222-49f3-8c06-5ba78388a6cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.398265 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8972ee68-f222-49f3-8c06-5ba78388a6cd-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.411550 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8972ee68-f222-49f3-8c06-5ba78388a6cd-kube-api-access-8phmt" (OuterVolumeSpecName: "kube-api-access-8phmt") pod "8972ee68-f222-49f3-8c06-5ba78388a6cd" (UID: "8972ee68-f222-49f3-8c06-5ba78388a6cd"). InnerVolumeSpecName "kube-api-access-8phmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.499891 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8phmt\" (UniqueName: \"kubernetes.io/projected/8972ee68-f222-49f3-8c06-5ba78388a6cd-kube-api-access-8phmt\") on node \"crc\" DevicePath \"\"" Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.543571 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8972ee68-f222-49f3-8c06-5ba78388a6cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8972ee68-f222-49f3-8c06-5ba78388a6cd" (UID: "8972ee68-f222-49f3-8c06-5ba78388a6cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.603118 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8972ee68-f222-49f3-8c06-5ba78388a6cd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.752278 4721 generic.go:334] "Generic (PLEG): container finished" podID="8972ee68-f222-49f3-8c06-5ba78388a6cd" containerID="74a4c11b6ce3b5a6ca6b3a1f395910e994f318787b772befd6fcc6ba4a326763" exitCode=0 Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.752325 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmst6" event={"ID":"8972ee68-f222-49f3-8c06-5ba78388a6cd","Type":"ContainerDied","Data":"74a4c11b6ce3b5a6ca6b3a1f395910e994f318787b772befd6fcc6ba4a326763"} Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.752357 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmst6" event={"ID":"8972ee68-f222-49f3-8c06-5ba78388a6cd","Type":"ContainerDied","Data":"7d900a7ae12137341c4c93ae85dcbc015a3fb6c748d34ccdb600fe258145bb48"} Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.752376 4721 scope.go:117] "RemoveContainer" containerID="74a4c11b6ce3b5a6ca6b3a1f395910e994f318787b772befd6fcc6ba4a326763" Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.752394 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmst6" Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.775978 4721 scope.go:117] "RemoveContainer" containerID="d4ec6deb0c199fe2932aa5da9887fd2837b3431d82684c3e3ecea9b39b062f27" Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.814727 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fmst6"] Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.835006 4721 scope.go:117] "RemoveContainer" containerID="f4f5a2a5cea46a62873fff8e7f1234ae4c404df64b0975cf55881d1c4fbdadb9" Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.837355 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fmst6"] Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.879615 4721 scope.go:117] "RemoveContainer" containerID="74a4c11b6ce3b5a6ca6b3a1f395910e994f318787b772befd6fcc6ba4a326763" Feb 02 13:35:03 crc kubenswrapper[4721]: E0202 13:35:03.880118 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a4c11b6ce3b5a6ca6b3a1f395910e994f318787b772befd6fcc6ba4a326763\": container with ID starting with 74a4c11b6ce3b5a6ca6b3a1f395910e994f318787b772befd6fcc6ba4a326763 not found: ID does not exist" containerID="74a4c11b6ce3b5a6ca6b3a1f395910e994f318787b772befd6fcc6ba4a326763" Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.880156 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a4c11b6ce3b5a6ca6b3a1f395910e994f318787b772befd6fcc6ba4a326763"} err="failed to get container status \"74a4c11b6ce3b5a6ca6b3a1f395910e994f318787b772befd6fcc6ba4a326763\": rpc error: code = NotFound desc = could not find container \"74a4c11b6ce3b5a6ca6b3a1f395910e994f318787b772befd6fcc6ba4a326763\": container with ID starting with 74a4c11b6ce3b5a6ca6b3a1f395910e994f318787b772befd6fcc6ba4a326763 not found: ID does not exist" Feb 02 13:35:03 crc 
kubenswrapper[4721]: I0202 13:35:03.880178 4721 scope.go:117] "RemoveContainer" containerID="d4ec6deb0c199fe2932aa5da9887fd2837b3431d82684c3e3ecea9b39b062f27" Feb 02 13:35:03 crc kubenswrapper[4721]: E0202 13:35:03.880666 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4ec6deb0c199fe2932aa5da9887fd2837b3431d82684c3e3ecea9b39b062f27\": container with ID starting with d4ec6deb0c199fe2932aa5da9887fd2837b3431d82684c3e3ecea9b39b062f27 not found: ID does not exist" containerID="d4ec6deb0c199fe2932aa5da9887fd2837b3431d82684c3e3ecea9b39b062f27" Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.880720 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ec6deb0c199fe2932aa5da9887fd2837b3431d82684c3e3ecea9b39b062f27"} err="failed to get container status \"d4ec6deb0c199fe2932aa5da9887fd2837b3431d82684c3e3ecea9b39b062f27\": rpc error: code = NotFound desc = could not find container \"d4ec6deb0c199fe2932aa5da9887fd2837b3431d82684c3e3ecea9b39b062f27\": container with ID starting with d4ec6deb0c199fe2932aa5da9887fd2837b3431d82684c3e3ecea9b39b062f27 not found: ID does not exist" Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.880755 4721 scope.go:117] "RemoveContainer" containerID="f4f5a2a5cea46a62873fff8e7f1234ae4c404df64b0975cf55881d1c4fbdadb9" Feb 02 13:35:03 crc kubenswrapper[4721]: E0202 13:35:03.881122 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4f5a2a5cea46a62873fff8e7f1234ae4c404df64b0975cf55881d1c4fbdadb9\": container with ID starting with f4f5a2a5cea46a62873fff8e7f1234ae4c404df64b0975cf55881d1c4fbdadb9 not found: ID does not exist" containerID="f4f5a2a5cea46a62873fff8e7f1234ae4c404df64b0975cf55881d1c4fbdadb9" Feb 02 13:35:03 crc kubenswrapper[4721]: I0202 13:35:03.881154 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f5a2a5cea46a62873fff8e7f1234ae4c404df64b0975cf55881d1c4fbdadb9"} err="failed to get container status \"f4f5a2a5cea46a62873fff8e7f1234ae4c404df64b0975cf55881d1c4fbdadb9\": rpc error: code = NotFound desc = could not find container \"f4f5a2a5cea46a62873fff8e7f1234ae4c404df64b0975cf55881d1c4fbdadb9\": container with ID starting with f4f5a2a5cea46a62873fff8e7f1234ae4c404df64b0975cf55881d1c4fbdadb9 not found: ID does not exist" Feb 02 13:35:04 crc kubenswrapper[4721]: I0202 13:35:04.425555 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8972ee68-f222-49f3-8c06-5ba78388a6cd" path="/var/lib/kubelet/pods/8972ee68-f222-49f3-8c06-5ba78388a6cd/volumes" Feb 02 13:35:12 crc kubenswrapper[4721]: I0202 13:35:12.514527 4721 scope.go:117] "RemoveContainer" containerID="0b784cae8dddc21c2c3af89d409032d5d888340b53c786c1dc27d600d257dd2b" Feb 02 13:35:12 crc kubenswrapper[4721]: I0202 13:35:12.561722 4721 scope.go:117] "RemoveContainer" containerID="2f2b028f4f0c88964c0238ef71f7a14ee0e0d63a6586667e0a8a76c80b585914" Feb 02 13:35:12 crc kubenswrapper[4721]: I0202 13:35:12.625323 4721 scope.go:117] "RemoveContainer" containerID="ce2db44950c758448aaea5320ccdad1fe422fd10d5dc9377dff5887076136a7a" Feb 02 13:35:28 crc kubenswrapper[4721]: I0202 13:35:28.064389 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-22bg7"] Feb 02 13:35:28 crc kubenswrapper[4721]: I0202 13:35:28.078545 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-db-create-22bg7"] Feb 02 13:35:28 crc kubenswrapper[4721]: I0202 13:35:28.429504 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d33cb7-8d98-44cc-97ef-229d34805e46" path="/var/lib/kubelet/pods/31d33cb7-8d98-44cc-97ef-229d34805e46/volumes" Feb 02 13:35:29 crc kubenswrapper[4721]: I0202 13:35:29.049610 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-zbntf"] Feb 02 13:35:29 crc kubenswrapper[4721]: I0202 13:35:29.102287 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-7hfxs"] Feb 02 13:35:29 crc kubenswrapper[4721]: I0202 13:35:29.121134 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-zbntf"] Feb 02 13:35:29 crc kubenswrapper[4721]: I0202 13:35:29.134928 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-7hfxs"] Feb 02 13:35:30 crc kubenswrapper[4721]: I0202 13:35:30.031898 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5c37-account-create-update-h9w2m"] Feb 02 13:35:30 crc kubenswrapper[4721]: I0202 13:35:30.041591 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-6b23-account-create-update-5q82h"] Feb 02 13:35:30 crc kubenswrapper[4721]: I0202 13:35:30.053363 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-ecf5-account-create-update-k6kdv"] Feb 02 13:35:30 crc kubenswrapper[4721]: I0202 13:35:30.062889 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-5c37-account-create-update-h9w2m"] Feb 02 13:35:30 crc kubenswrapper[4721]: I0202 13:35:30.071433 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-ecf5-account-create-update-k6kdv"] Feb 02 13:35:30 crc kubenswrapper[4721]: I0202 13:35:30.081786 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6b23-account-create-update-5q82h"] Feb 02 13:35:30 crc kubenswrapper[4721]: I0202 13:35:30.428707 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="011e7b6f-64eb-48b5-be89-8304581d4c5f" path="/var/lib/kubelet/pods/011e7b6f-64eb-48b5-be89-8304581d4c5f/volumes" Feb 02 13:35:30 crc kubenswrapper[4721]: I0202 13:35:30.432173 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd874ae-fdb8-4f98-ae51-dac54a44e001" path="/var/lib/kubelet/pods/0dd874ae-fdb8-4f98-ae51-dac54a44e001/volumes" Feb 02 13:35:30 crc kubenswrapper[4721]: I0202 13:35:30.433506 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="147719a3-96ca-4551-a395-648dd45b4ce6" path="/var/lib/kubelet/pods/147719a3-96ca-4551-a395-648dd45b4ce6/volumes" Feb 02 13:35:30 crc kubenswrapper[4721]: I0202 13:35:30.435128 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e297f9-7808-4195-86b2-2c17f4638bf2" path="/var/lib/kubelet/pods/84e297f9-7808-4195-86b2-2c17f4638bf2/volumes" Feb 02 13:35:30 crc kubenswrapper[4721]: I0202 13:35:30.436513 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1" path="/var/lib/kubelet/pods/a5e633c7-bd54-4bcb-9f5c-73d6b059d0e1/volumes" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.019240 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gs6c5"] Feb 02 13:35:35 crc kubenswrapper[4721]: E0202 13:35:35.020624 4721 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8972ee68-f222-49f3-8c06-5ba78388a6cd" containerName="extract-utilities" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.020658 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="8972ee68-f222-49f3-8c06-5ba78388a6cd" containerName="extract-utilities" Feb 02 13:35:35 crc kubenswrapper[4721]: E0202 13:35:35.020690 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8972ee68-f222-49f3-8c06-5ba78388a6cd" containerName="extract-content" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.020704 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="8972ee68-f222-49f3-8c06-5ba78388a6cd" containerName="extract-content" Feb 02 13:35:35 crc kubenswrapper[4721]: E0202 13:35:35.020757 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8972ee68-f222-49f3-8c06-5ba78388a6cd" containerName="registry-server" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.020772 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="8972ee68-f222-49f3-8c06-5ba78388a6cd" containerName="registry-server" Feb 02 13:35:35 crc kubenswrapper[4721]: E0202 13:35:35.020817 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d44a404-e01d-4a19-a103-58cbbafbdc7b" containerName="registry-server" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.020830 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d44a404-e01d-4a19-a103-58cbbafbdc7b" containerName="registry-server" Feb 02 13:35:35 crc kubenswrapper[4721]: E0202 13:35:35.020851 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d44a404-e01d-4a19-a103-58cbbafbdc7b" containerName="extract-content" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.020863 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d44a404-e01d-4a19-a103-58cbbafbdc7b" containerName="extract-content" Feb 02 13:35:35 crc kubenswrapper[4721]: E0202 13:35:35.020885 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d44a404-e01d-4a19-a103-58cbbafbdc7b" containerName="extract-utilities" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.020897 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d44a404-e01d-4a19-a103-58cbbafbdc7b" containerName="extract-utilities" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.021348 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="8972ee68-f222-49f3-8c06-5ba78388a6cd" containerName="registry-server" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.021390 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d44a404-e01d-4a19-a103-58cbbafbdc7b" containerName="registry-server" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.025892 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.029181 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gs6c5"] Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.177366 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb6fd6e8-1134-4152-ba9f-c13b2660d022-catalog-content\") pod \"redhat-marketplace-gs6c5\" (UID: \"fb6fd6e8-1134-4152-ba9f-c13b2660d022\") " pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.177791 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb6fd6e8-1134-4152-ba9f-c13b2660d022-utilities\") pod \"redhat-marketplace-gs6c5\" (UID: \"fb6fd6e8-1134-4152-ba9f-c13b2660d022\") " pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.177848 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsd9z\" (UniqueName: \"kubernetes.io/projected/fb6fd6e8-1134-4152-ba9f-c13b2660d022-kube-api-access-hsd9z\") pod \"redhat-marketplace-gs6c5\" (UID: \"fb6fd6e8-1134-4152-ba9f-c13b2660d022\") " pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.280803 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb6fd6e8-1134-4152-ba9f-c13b2660d022-catalog-content\") pod \"redhat-marketplace-gs6c5\" (UID: \"fb6fd6e8-1134-4152-ba9f-c13b2660d022\") " pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.281237 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb6fd6e8-1134-4152-ba9f-c13b2660d022-utilities\") pod \"redhat-marketplace-gs6c5\" (UID: \"fb6fd6e8-1134-4152-ba9f-c13b2660d022\") " pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.281274 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb6fd6e8-1134-4152-ba9f-c13b2660d022-catalog-content\") pod \"redhat-marketplace-gs6c5\" (UID: \"fb6fd6e8-1134-4152-ba9f-c13b2660d022\") " pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.281295 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsd9z\" (UniqueName: \"kubernetes.io/projected/fb6fd6e8-1134-4152-ba9f-c13b2660d022-kube-api-access-hsd9z\") pod \"redhat-marketplace-gs6c5\" (UID: \"fb6fd6e8-1134-4152-ba9f-c13b2660d022\") " pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.281538 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb6fd6e8-1134-4152-ba9f-c13b2660d022-utilities\") pod \"redhat-marketplace-gs6c5\" (UID: \"fb6fd6e8-1134-4152-ba9f-c13b2660d022\") " pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.302803 4721 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hsd9z\" (UniqueName: \"kubernetes.io/projected/fb6fd6e8-1134-4152-ba9f-c13b2660d022-kube-api-access-hsd9z\") pod \"redhat-marketplace-gs6c5\" (UID: \"fb6fd6e8-1134-4152-ba9f-c13b2660d022\") " pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.366781 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:35 crc kubenswrapper[4721]: I0202 13:35:35.871003 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gs6c5"] Feb 02 13:35:35 crc kubenswrapper[4721]: W0202 13:35:35.875132 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb6fd6e8_1134_4152_ba9f_c13b2660d022.slice/crio-792dec176685fa30bf565a9c8042fc5dfa645b2c73f5444ff1b7551c0cbc3ab4 WatchSource:0}: Error finding container 792dec176685fa30bf565a9c8042fc5dfa645b2c73f5444ff1b7551c0cbc3ab4: Status 404 returned error can't find the container with id 792dec176685fa30bf565a9c8042fc5dfa645b2c73f5444ff1b7551c0cbc3ab4 Feb 02 13:35:36 crc kubenswrapper[4721]: I0202 13:35:36.143821 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gs6c5" event={"ID":"fb6fd6e8-1134-4152-ba9f-c13b2660d022","Type":"ContainerStarted","Data":"792dec176685fa30bf565a9c8042fc5dfa645b2c73f5444ff1b7551c0cbc3ab4"} Feb 02 13:35:37 crc kubenswrapper[4721]: I0202 13:35:37.162638 4721 generic.go:334] "Generic (PLEG): container finished" podID="fb6fd6e8-1134-4152-ba9f-c13b2660d022" containerID="7da1838c8886caa5f3234f5e18217a2f0dba246c557d9428c436ad49653d6fce" exitCode=0 Feb 02 13:35:37 crc kubenswrapper[4721]: I0202 13:35:37.162730 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gs6c5" event={"ID":"fb6fd6e8-1134-4152-ba9f-c13b2660d022","Type":"ContainerDied","Data":"7da1838c8886caa5f3234f5e18217a2f0dba246c557d9428c436ad49653d6fce"} Feb 02 13:35:38 crc kubenswrapper[4721]: I0202 13:35:38.176525 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gs6c5" event={"ID":"fb6fd6e8-1134-4152-ba9f-c13b2660d022","Type":"ContainerStarted","Data":"7445ca06a745cdd7aa336fe3800ae4c4859529f0dbaf6a0d2eebc5dacedf8885"} Feb 02 13:35:39 crc kubenswrapper[4721]: I0202 13:35:39.189156 4721 generic.go:334] "Generic (PLEG): container finished" podID="fb6fd6e8-1134-4152-ba9f-c13b2660d022" containerID="7445ca06a745cdd7aa336fe3800ae4c4859529f0dbaf6a0d2eebc5dacedf8885" exitCode=0 Feb 02 13:35:39 crc kubenswrapper[4721]: I0202 13:35:39.189230 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gs6c5" event={"ID":"fb6fd6e8-1134-4152-ba9f-c13b2660d022","Type":"ContainerDied","Data":"7445ca06a745cdd7aa336fe3800ae4c4859529f0dbaf6a0d2eebc5dacedf8885"} Feb 02 13:35:40 crc kubenswrapper[4721]: I0202 13:35:40.204716 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gs6c5" event={"ID":"fb6fd6e8-1134-4152-ba9f-c13b2660d022","Type":"ContainerStarted","Data":"26276be656c22094ec0120095d3587af0bacd94837cef6adde92a89050e06bdf"} Feb 02 13:35:40 crc kubenswrapper[4721]: I0202 13:35:40.240832 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gs6c5" podStartSLOduration=3.7900480500000002 
podStartE2EDuration="6.240810497s" podCreationTimestamp="2026-02-02 13:35:34 +0000 UTC" firstStartedPulling="2026-02-02 13:35:37.165900028 +0000 UTC m=+2077.468414427" lastFinishedPulling="2026-02-02 13:35:39.616662445 +0000 UTC m=+2079.919176874" observedRunningTime="2026-02-02 13:35:40.226883357 +0000 UTC m=+2080.529397776" watchObservedRunningTime="2026-02-02 13:35:40.240810497 +0000 UTC m=+2080.543324886" Feb 02 13:35:45 crc kubenswrapper[4721]: I0202 13:35:45.367828 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:45 crc kubenswrapper[4721]: I0202 13:35:45.368399 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:45 crc kubenswrapper[4721]: I0202 13:35:45.428387 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:46 crc kubenswrapper[4721]: I0202 13:35:46.349932 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:46 crc kubenswrapper[4721]: I0202 13:35:46.426428 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gs6c5"] Feb 02 13:35:48 crc kubenswrapper[4721]: I0202 13:35:48.316783 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gs6c5" podUID="fb6fd6e8-1134-4152-ba9f-c13b2660d022" containerName="registry-server" containerID="cri-o://26276be656c22094ec0120095d3587af0bacd94837cef6adde92a89050e06bdf" gracePeriod=2 Feb 02 13:35:48 crc kubenswrapper[4721]: I0202 13:35:48.881550 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:48 crc kubenswrapper[4721]: I0202 13:35:48.973141 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsd9z\" (UniqueName: \"kubernetes.io/projected/fb6fd6e8-1134-4152-ba9f-c13b2660d022-kube-api-access-hsd9z\") pod \"fb6fd6e8-1134-4152-ba9f-c13b2660d022\" (UID: \"fb6fd6e8-1134-4152-ba9f-c13b2660d022\") " Feb 02 13:35:48 crc kubenswrapper[4721]: I0202 13:35:48.973234 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb6fd6e8-1134-4152-ba9f-c13b2660d022-catalog-content\") pod \"fb6fd6e8-1134-4152-ba9f-c13b2660d022\" (UID: \"fb6fd6e8-1134-4152-ba9f-c13b2660d022\") " Feb 02 13:35:48 crc kubenswrapper[4721]: I0202 13:35:48.973389 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb6fd6e8-1134-4152-ba9f-c13b2660d022-utilities\") pod \"fb6fd6e8-1134-4152-ba9f-c13b2660d022\" (UID: \"fb6fd6e8-1134-4152-ba9f-c13b2660d022\") " Feb 02 13:35:48 crc kubenswrapper[4721]: I0202 13:35:48.974952 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb6fd6e8-1134-4152-ba9f-c13b2660d022-utilities" (OuterVolumeSpecName: "utilities") pod "fb6fd6e8-1134-4152-ba9f-c13b2660d022" (UID: "fb6fd6e8-1134-4152-ba9f-c13b2660d022"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:35:48 crc kubenswrapper[4721]: I0202 13:35:48.984061 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb6fd6e8-1134-4152-ba9f-c13b2660d022-kube-api-access-hsd9z" (OuterVolumeSpecName: "kube-api-access-hsd9z") pod "fb6fd6e8-1134-4152-ba9f-c13b2660d022" (UID: "fb6fd6e8-1134-4152-ba9f-c13b2660d022"). InnerVolumeSpecName "kube-api-access-hsd9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:35:48 crc kubenswrapper[4721]: I0202 13:35:48.997115 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb6fd6e8-1134-4152-ba9f-c13b2660d022-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb6fd6e8-1134-4152-ba9f-c13b2660d022" (UID: "fb6fd6e8-1134-4152-ba9f-c13b2660d022"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.076733 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsd9z\" (UniqueName: \"kubernetes.io/projected/fb6fd6e8-1134-4152-ba9f-c13b2660d022-kube-api-access-hsd9z\") on node \"crc\" DevicePath \"\"" Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.076773 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb6fd6e8-1134-4152-ba9f-c13b2660d022-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.076787 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb6fd6e8-1134-4152-ba9f-c13b2660d022-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.322059 4721 generic.go:334] "Generic (PLEG): container finished" podID="fb6fd6e8-1134-4152-ba9f-c13b2660d022" containerID="26276be656c22094ec0120095d3587af0bacd94837cef6adde92a89050e06bdf" exitCode=0 Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.322114 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gs6c5" event={"ID":"fb6fd6e8-1134-4152-ba9f-c13b2660d022","Type":"ContainerDied","Data":"26276be656c22094ec0120095d3587af0bacd94837cef6adde92a89050e06bdf"} Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.322159 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gs6c5" Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.322413 4721 scope.go:117] "RemoveContainer" containerID="26276be656c22094ec0120095d3587af0bacd94837cef6adde92a89050e06bdf" Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.322397 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gs6c5" event={"ID":"fb6fd6e8-1134-4152-ba9f-c13b2660d022","Type":"ContainerDied","Data":"792dec176685fa30bf565a9c8042fc5dfa645b2c73f5444ff1b7551c0cbc3ab4"} Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.374686 4721 scope.go:117] "RemoveContainer" containerID="7445ca06a745cdd7aa336fe3800ae4c4859529f0dbaf6a0d2eebc5dacedf8885" Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.381539 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gs6c5"] Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.397102 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gs6c5"] Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.398226 4721 scope.go:117] "RemoveContainer" containerID="7da1838c8886caa5f3234f5e18217a2f0dba246c557d9428c436ad49653d6fce" Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.463979 4721 scope.go:117] "RemoveContainer" containerID="26276be656c22094ec0120095d3587af0bacd94837cef6adde92a89050e06bdf" Feb 02 13:35:49 crc kubenswrapper[4721]: E0202 13:35:49.464463 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26276be656c22094ec0120095d3587af0bacd94837cef6adde92a89050e06bdf\": container with ID starting with 26276be656c22094ec0120095d3587af0bacd94837cef6adde92a89050e06bdf not found: ID does not exist" containerID="26276be656c22094ec0120095d3587af0bacd94837cef6adde92a89050e06bdf" Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.464509 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26276be656c22094ec0120095d3587af0bacd94837cef6adde92a89050e06bdf"} err="failed to get container status \"26276be656c22094ec0120095d3587af0bacd94837cef6adde92a89050e06bdf\": rpc error: code = NotFound desc = could not find container \"26276be656c22094ec0120095d3587af0bacd94837cef6adde92a89050e06bdf\": container with ID starting with 26276be656c22094ec0120095d3587af0bacd94837cef6adde92a89050e06bdf not found: ID does not exist" Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.464535 4721 scope.go:117] "RemoveContainer" containerID="7445ca06a745cdd7aa336fe3800ae4c4859529f0dbaf6a0d2eebc5dacedf8885" Feb 02 13:35:49 crc kubenswrapper[4721]: E0202 13:35:49.465100 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7445ca06a745cdd7aa336fe3800ae4c4859529f0dbaf6a0d2eebc5dacedf8885\": container with ID starting with 7445ca06a745cdd7aa336fe3800ae4c4859529f0dbaf6a0d2eebc5dacedf8885 not found: ID does not exist" containerID="7445ca06a745cdd7aa336fe3800ae4c4859529f0dbaf6a0d2eebc5dacedf8885" Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.465139 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7445ca06a745cdd7aa336fe3800ae4c4859529f0dbaf6a0d2eebc5dacedf8885"} err="failed to get container status \"7445ca06a745cdd7aa336fe3800ae4c4859529f0dbaf6a0d2eebc5dacedf8885\": rpc error: code = NotFound desc = could not find 
container \"7445ca06a745cdd7aa336fe3800ae4c4859529f0dbaf6a0d2eebc5dacedf8885\": container with ID starting with 7445ca06a745cdd7aa336fe3800ae4c4859529f0dbaf6a0d2eebc5dacedf8885 not found: ID does not exist" Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.465164 4721 scope.go:117] "RemoveContainer" containerID="7da1838c8886caa5f3234f5e18217a2f0dba246c557d9428c436ad49653d6fce" Feb 02 13:35:49 crc kubenswrapper[4721]: E0202 13:35:49.465618 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7da1838c8886caa5f3234f5e18217a2f0dba246c557d9428c436ad49653d6fce\": container with ID starting with 7da1838c8886caa5f3234f5e18217a2f0dba246c557d9428c436ad49653d6fce not found: ID does not exist" containerID="7da1838c8886caa5f3234f5e18217a2f0dba246c557d9428c436ad49653d6fce" Feb 02 13:35:49 crc kubenswrapper[4721]: I0202 13:35:49.465656 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7da1838c8886caa5f3234f5e18217a2f0dba246c557d9428c436ad49653d6fce"} err="failed to get container status \"7da1838c8886caa5f3234f5e18217a2f0dba246c557d9428c436ad49653d6fce\": rpc error: code = NotFound desc = could not find container \"7da1838c8886caa5f3234f5e18217a2f0dba246c557d9428c436ad49653d6fce\": container with ID starting with 7da1838c8886caa5f3234f5e18217a2f0dba246c557d9428c436ad49653d6fce not found: ID does not exist" Feb 02 13:35:50 crc kubenswrapper[4721]: I0202 13:35:50.428405 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb6fd6e8-1134-4152-ba9f-c13b2660d022" path="/var/lib/kubelet/pods/fb6fd6e8-1134-4152-ba9f-c13b2660d022/volumes" Feb 02 13:36:06 crc kubenswrapper[4721]: I0202 13:36:06.045606 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9h27j"] Feb 02 13:36:06 crc kubenswrapper[4721]: I0202 13:36:06.057032 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9h27j"] Feb 02 13:36:06 crc kubenswrapper[4721]: I0202 13:36:06.423344 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f01b253a-c7c6-4c9e-a800-a1732ba06f37" path="/var/lib/kubelet/pods/f01b253a-c7c6-4c9e-a800-a1732ba06f37/volumes" Feb 02 13:36:12 crc kubenswrapper[4721]: I0202 13:36:12.838527 4721 scope.go:117] "RemoveContainer" containerID="a5de70229867c13b14cef49c003c51ee5606bf312afc995295088db8627fee73" Feb 02 13:36:12 crc kubenswrapper[4721]: I0202 13:36:12.871452 4721 scope.go:117] "RemoveContainer" containerID="d2ceca0ae565540870f02401237596a428eda4dcf72c0158ff8c7116acbfc486" Feb 02 13:36:12 crc kubenswrapper[4721]: I0202 13:36:12.936857 4721 scope.go:117] "RemoveContainer" containerID="663f048cc96da678155c1a95cb9691e5043c9f03a4259611e0be7f358517c2f1" Feb 02 13:36:12 crc kubenswrapper[4721]: I0202 13:36:12.990115 4721 scope.go:117] "RemoveContainer" containerID="8b1af4c2f51b92e7d8e686eda8c46c518ebd4e4b694ff77617f39b7b376a484a" Feb 02 13:36:13 crc kubenswrapper[4721]: I0202 13:36:13.082610 4721 scope.go:117] "RemoveContainer" containerID="a4df6290a6ff822c9798aad4bb78bddad86f6ee3871a5520d115e6e491f3950e" Feb 02 13:36:13 crc kubenswrapper[4721]: I0202 13:36:13.136266 4721 scope.go:117] "RemoveContainer" containerID="9f52f888eba9daf5e6e283524ff7481c4a05eeb6d1ae52e82e3fc8b08b7473c5" Feb 02 13:36:13 crc kubenswrapper[4721]: I0202 13:36:13.197882 4721 scope.go:117] "RemoveContainer" containerID="4fc7b5047fba1d65d14acd76d647401f0b37510534baf6ea3bf2254dfd744004" Feb 02 13:36:15 
crc kubenswrapper[4721]: I0202 13:36:15.075398 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-gq5tv"] Feb 02 13:36:15 crc kubenswrapper[4721]: I0202 13:36:15.090601 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-gq5tv"] Feb 02 13:36:16 crc kubenswrapper[4721]: I0202 13:36:16.047272 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-2c26-account-create-update-2r4tb"] Feb 02 13:36:16 crc kubenswrapper[4721]: I0202 13:36:16.057481 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-2c26-account-create-update-2r4tb"] Feb 02 13:36:16 crc kubenswrapper[4721]: I0202 13:36:16.424355 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="997707ef-4296-4151-9385-0fbb48b5e317" path="/var/lib/kubelet/pods/997707ef-4296-4151-9385-0fbb48b5e317/volumes" Feb 02 13:36:16 crc kubenswrapper[4721]: I0202 13:36:16.425172 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9" path="/var/lib/kubelet/pods/ad02c2c3-e07d-4ab9-8498-26e3c2bfdfb9/volumes" Feb 02 13:36:30 crc kubenswrapper[4721]: I0202 13:36:30.034789 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-4rm59"] Feb 02 13:36:30 crc kubenswrapper[4721]: I0202 13:36:30.044862 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-4rm59"] Feb 02 13:36:30 crc kubenswrapper[4721]: I0202 13:36:30.424326 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="798dac79-94bd-4655-b409-4b173956cdbf" path="/var/lib/kubelet/pods/798dac79-94bd-4655-b409-4b173956cdbf/volumes" Feb 02 13:36:32 crc kubenswrapper[4721]: I0202 13:36:32.035285 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hclll"] Feb 02 13:36:32 crc kubenswrapper[4721]: I0202 13:36:32.055605 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hclll"] Feb 02 13:36:32 crc kubenswrapper[4721]: I0202 13:36:32.427386 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d38541e-139a-425e-a7bd-f7c484f7266b" path="/var/lib/kubelet/pods/6d38541e-139a-425e-a7bd-f7c484f7266b/volumes" Feb 02 13:36:44 crc kubenswrapper[4721]: I0202 13:36:44.763959 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:36:44 crc kubenswrapper[4721]: I0202 13:36:44.764436 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:37:13 crc kubenswrapper[4721]: I0202 13:37:13.355291 4721 scope.go:117] "RemoveContainer" containerID="986695a877ff6ee1eff46d199a1d9a03005ab3f7a3f83da3d9212653e0413f7c" Feb 02 13:37:13 crc kubenswrapper[4721]: I0202 13:37:13.426108 4721 scope.go:117] "RemoveContainer" containerID="2995f84a212135f6e3822d895963dca122d6d439797b7dcab3e6c3486dd7be70" Feb 02 13:37:13 crc kubenswrapper[4721]: I0202 13:37:13.485141 4721 scope.go:117] "RemoveContainer" 
containerID="7eef54336f3dd0e3ab875f1a48960393b687a8f714292e0019c4853d4094a4be" Feb 02 13:37:13 crc kubenswrapper[4721]: I0202 13:37:13.545881 4721 scope.go:117] "RemoveContainer" containerID="1857f80c5a621ba0a1377e0d92d2bafe33fbe9b8df8cc9bde107743c0bafe96d" Feb 02 13:37:14 crc kubenswrapper[4721]: I0202 13:37:14.763935 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:37:14 crc kubenswrapper[4721]: I0202 13:37:14.764003 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:37:16 crc kubenswrapper[4721]: I0202 13:37:16.057603 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-crdjt"] Feb 02 13:37:16 crc kubenswrapper[4721]: I0202 13:37:16.071590 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-crdjt"] Feb 02 13:37:16 crc kubenswrapper[4721]: I0202 13:37:16.425546 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27df0911-fe79-4339-a6fe-cf538f97a247" path="/var/lib/kubelet/pods/27df0911-fe79-4339-a6fe-cf538f97a247/volumes" Feb 02 13:37:44 crc kubenswrapper[4721]: I0202 13:37:44.765193 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:37:44 crc kubenswrapper[4721]: I0202 13:37:44.766184 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:37:44 crc kubenswrapper[4721]: I0202 13:37:44.766227 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:37:44 crc kubenswrapper[4721]: I0202 13:37:44.767170 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8db65d587513bf606777ac46eb9a8dd677cd18af3524da4a78bb404625c5d58f"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 13:37:44 crc kubenswrapper[4721]: I0202 13:37:44.767218 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://8db65d587513bf606777ac46eb9a8dd677cd18af3524da4a78bb404625c5d58f" gracePeriod=600 Feb 02 13:37:45 crc kubenswrapper[4721]: I0202 13:37:45.773286 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" 
event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"8db65d587513bf606777ac46eb9a8dd677cd18af3524da4a78bb404625c5d58f"} Feb 02 13:37:45 crc kubenswrapper[4721]: I0202 13:37:45.773201 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="8db65d587513bf606777ac46eb9a8dd677cd18af3524da4a78bb404625c5d58f" exitCode=0 Feb 02 13:37:45 crc kubenswrapper[4721]: I0202 13:37:45.773778 4721 scope.go:117] "RemoveContainer" containerID="28d0d51ec03a49d1ae8e4dc5ae68b3f445ea6c6231be11219a2f3b614eea299c" Feb 02 13:37:45 crc kubenswrapper[4721]: I0202 13:37:45.773815 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375"} Feb 02 13:38:13 crc kubenswrapper[4721]: I0202 13:38:13.681719 4721 scope.go:117] "RemoveContainer" containerID="38697017d92e58f9ce89dc861401173c6d0237bce2d2e6a9a0bc00c9d2093bfd" Feb 02 13:40:14 crc kubenswrapper[4721]: I0202 13:40:14.763443 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:40:14 crc kubenswrapper[4721]: I0202 13:40:14.765778 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.246254 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-grdzv"] Feb 02 13:40:26 crc kubenswrapper[4721]: E0202 13:40:26.247611 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6fd6e8-1134-4152-ba9f-c13b2660d022" containerName="registry-server" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.247628 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6fd6e8-1134-4152-ba9f-c13b2660d022" containerName="registry-server" Feb 02 13:40:26 crc kubenswrapper[4721]: E0202 13:40:26.247666 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6fd6e8-1134-4152-ba9f-c13b2660d022" containerName="extract-utilities" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.247689 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6fd6e8-1134-4152-ba9f-c13b2660d022" containerName="extract-utilities" Feb 02 13:40:26 crc kubenswrapper[4721]: E0202 13:40:26.247736 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6fd6e8-1134-4152-ba9f-c13b2660d022" containerName="extract-content" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.247745 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6fd6e8-1134-4152-ba9f-c13b2660d022" containerName="extract-content" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.248001 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb6fd6e8-1134-4152-ba9f-c13b2660d022" containerName="registry-server" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.252345 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.273586 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-grdzv"] Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.406714 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt7br\" (UniqueName: \"kubernetes.io/projected/fd185aef-3d37-4ad6-91f6-1470e8c39999-kube-api-access-kt7br\") pod \"certified-operators-grdzv\" (UID: \"fd185aef-3d37-4ad6-91f6-1470e8c39999\") " pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.406788 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd185aef-3d37-4ad6-91f6-1470e8c39999-utilities\") pod \"certified-operators-grdzv\" (UID: \"fd185aef-3d37-4ad6-91f6-1470e8c39999\") " pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.407326 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd185aef-3d37-4ad6-91f6-1470e8c39999-catalog-content\") pod \"certified-operators-grdzv\" (UID: \"fd185aef-3d37-4ad6-91f6-1470e8c39999\") " pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.510342 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd185aef-3d37-4ad6-91f6-1470e8c39999-catalog-content\") pod \"certified-operators-grdzv\" (UID: \"fd185aef-3d37-4ad6-91f6-1470e8c39999\") " pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.511011 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt7br\" (UniqueName: \"kubernetes.io/projected/fd185aef-3d37-4ad6-91f6-1470e8c39999-kube-api-access-kt7br\") pod \"certified-operators-grdzv\" (UID: \"fd185aef-3d37-4ad6-91f6-1470e8c39999\") " pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.511063 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd185aef-3d37-4ad6-91f6-1470e8c39999-utilities\") pod \"certified-operators-grdzv\" (UID: \"fd185aef-3d37-4ad6-91f6-1470e8c39999\") " pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.511837 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd185aef-3d37-4ad6-91f6-1470e8c39999-utilities\") pod \"certified-operators-grdzv\" (UID: \"fd185aef-3d37-4ad6-91f6-1470e8c39999\") " pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.513696 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd185aef-3d37-4ad6-91f6-1470e8c39999-catalog-content\") pod \"certified-operators-grdzv\" (UID: \"fd185aef-3d37-4ad6-91f6-1470e8c39999\") " pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.558548 4721 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kt7br\" (UniqueName: \"kubernetes.io/projected/fd185aef-3d37-4ad6-91f6-1470e8c39999-kube-api-access-kt7br\") pod \"certified-operators-grdzv\" (UID: \"fd185aef-3d37-4ad6-91f6-1470e8c39999\") " pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:26 crc kubenswrapper[4721]: I0202 13:40:26.574545 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:27 crc kubenswrapper[4721]: I0202 13:40:27.095059 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-grdzv"] Feb 02 13:40:27 crc kubenswrapper[4721]: I0202 13:40:27.640335 4721 generic.go:334] "Generic (PLEG): container finished" podID="fd185aef-3d37-4ad6-91f6-1470e8c39999" containerID="eb2c414ab7a095044068fc325ff9cdd4824dcdcf846d8f08971020fc8acf2a20" exitCode=0 Feb 02 13:40:27 crc kubenswrapper[4721]: I0202 13:40:27.640400 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grdzv" event={"ID":"fd185aef-3d37-4ad6-91f6-1470e8c39999","Type":"ContainerDied","Data":"eb2c414ab7a095044068fc325ff9cdd4824dcdcf846d8f08971020fc8acf2a20"} Feb 02 13:40:27 crc kubenswrapper[4721]: I0202 13:40:27.640440 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grdzv" event={"ID":"fd185aef-3d37-4ad6-91f6-1470e8c39999","Type":"ContainerStarted","Data":"03f6acbdacdbf0004670972985b55556e86b1549a5f9eabecd04a5ea87091b90"} Feb 02 13:40:27 crc kubenswrapper[4721]: I0202 13:40:27.649139 4721 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 13:40:29 crc kubenswrapper[4721]: I0202 13:40:29.670817 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grdzv" event={"ID":"fd185aef-3d37-4ad6-91f6-1470e8c39999","Type":"ContainerStarted","Data":"e0c93ba403bb3249e362943f6bbbde5f04706709dac91f6ae4e753f375727084"} Feb 02 13:40:30 crc kubenswrapper[4721]: I0202 13:40:30.683970 4721 generic.go:334] "Generic (PLEG): container finished" podID="fd185aef-3d37-4ad6-91f6-1470e8c39999" containerID="e0c93ba403bb3249e362943f6bbbde5f04706709dac91f6ae4e753f375727084" exitCode=0 Feb 02 13:40:30 crc kubenswrapper[4721]: I0202 13:40:30.684413 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grdzv" event={"ID":"fd185aef-3d37-4ad6-91f6-1470e8c39999","Type":"ContainerDied","Data":"e0c93ba403bb3249e362943f6bbbde5f04706709dac91f6ae4e753f375727084"} Feb 02 13:40:32 crc kubenswrapper[4721]: I0202 13:40:32.705576 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grdzv" event={"ID":"fd185aef-3d37-4ad6-91f6-1470e8c39999","Type":"ContainerStarted","Data":"278550fd60063b49953c40090a90bc342ae6b827a0bf1de59628401a7700c1fb"} Feb 02 13:40:32 crc kubenswrapper[4721]: I0202 13:40:32.740764 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-grdzv" podStartSLOduration=2.9267386909999997 podStartE2EDuration="6.740738327s" podCreationTimestamp="2026-02-02 13:40:26 +0000 UTC" firstStartedPulling="2026-02-02 13:40:27.648830715 +0000 UTC m=+2367.951345104" lastFinishedPulling="2026-02-02 13:40:31.462830341 +0000 UTC m=+2371.765344740" observedRunningTime="2026-02-02 13:40:32.737625913 +0000 UTC m=+2373.040140402" watchObservedRunningTime="2026-02-02 
13:40:32.740738327 +0000 UTC m=+2373.043252726" Feb 02 13:40:36 crc kubenswrapper[4721]: I0202 13:40:36.574682 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:36 crc kubenswrapper[4721]: I0202 13:40:36.575395 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:36 crc kubenswrapper[4721]: I0202 13:40:36.640560 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:36 crc kubenswrapper[4721]: I0202 13:40:36.815812 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.234112 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-grdzv"] Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.234959 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-grdzv" podUID="fd185aef-3d37-4ad6-91f6-1470e8c39999" containerName="registry-server" containerID="cri-o://278550fd60063b49953c40090a90bc342ae6b827a0bf1de59628401a7700c1fb" gracePeriod=2 Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.783426 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.795646 4721 generic.go:334] "Generic (PLEG): container finished" podID="fd185aef-3d37-4ad6-91f6-1470e8c39999" containerID="278550fd60063b49953c40090a90bc342ae6b827a0bf1de59628401a7700c1fb" exitCode=0 Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.795683 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grdzv" event={"ID":"fd185aef-3d37-4ad6-91f6-1470e8c39999","Type":"ContainerDied","Data":"278550fd60063b49953c40090a90bc342ae6b827a0bf1de59628401a7700c1fb"} Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.795706 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-grdzv" event={"ID":"fd185aef-3d37-4ad6-91f6-1470e8c39999","Type":"ContainerDied","Data":"03f6acbdacdbf0004670972985b55556e86b1549a5f9eabecd04a5ea87091b90"} Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.795735 4721 scope.go:117] "RemoveContainer" containerID="278550fd60063b49953c40090a90bc342ae6b827a0bf1de59628401a7700c1fb" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.795784 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-grdzv" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.812460 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd185aef-3d37-4ad6-91f6-1470e8c39999-catalog-content\") pod \"fd185aef-3d37-4ad6-91f6-1470e8c39999\" (UID: \"fd185aef-3d37-4ad6-91f6-1470e8c39999\") " Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.813030 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd185aef-3d37-4ad6-91f6-1470e8c39999-utilities\") pod \"fd185aef-3d37-4ad6-91f6-1470e8c39999\" (UID: \"fd185aef-3d37-4ad6-91f6-1470e8c39999\") " Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.813058 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt7br\" (UniqueName: \"kubernetes.io/projected/fd185aef-3d37-4ad6-91f6-1470e8c39999-kube-api-access-kt7br\") pod \"fd185aef-3d37-4ad6-91f6-1470e8c39999\" (UID: \"fd185aef-3d37-4ad6-91f6-1470e8c39999\") " Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.823779 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd185aef-3d37-4ad6-91f6-1470e8c39999-utilities" (OuterVolumeSpecName: "utilities") pod "fd185aef-3d37-4ad6-91f6-1470e8c39999" (UID: "fd185aef-3d37-4ad6-91f6-1470e8c39999"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.855597 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd185aef-3d37-4ad6-91f6-1470e8c39999-kube-api-access-kt7br" (OuterVolumeSpecName: "kube-api-access-kt7br") pod "fd185aef-3d37-4ad6-91f6-1470e8c39999" (UID: "fd185aef-3d37-4ad6-91f6-1470e8c39999"). InnerVolumeSpecName "kube-api-access-kt7br". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.862519 4721 scope.go:117] "RemoveContainer" containerID="e0c93ba403bb3249e362943f6bbbde5f04706709dac91f6ae4e753f375727084" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.900355 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd185aef-3d37-4ad6-91f6-1470e8c39999-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd185aef-3d37-4ad6-91f6-1470e8c39999" (UID: "fd185aef-3d37-4ad6-91f6-1470e8c39999"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.906815 4721 scope.go:117] "RemoveContainer" containerID="eb2c414ab7a095044068fc325ff9cdd4824dcdcf846d8f08971020fc8acf2a20" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.918343 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd185aef-3d37-4ad6-91f6-1470e8c39999-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.918383 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd185aef-3d37-4ad6-91f6-1470e8c39999-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.918430 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt7br\" (UniqueName: \"kubernetes.io/projected/fd185aef-3d37-4ad6-91f6-1470e8c39999-kube-api-access-kt7br\") on node \"crc\" DevicePath \"\"" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.958651 4721 scope.go:117] "RemoveContainer" containerID="278550fd60063b49953c40090a90bc342ae6b827a0bf1de59628401a7700c1fb" Feb 02 13:40:40 crc kubenswrapper[4721]: E0202 13:40:40.959178 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"278550fd60063b49953c40090a90bc342ae6b827a0bf1de59628401a7700c1fb\": container with ID starting with 278550fd60063b49953c40090a90bc342ae6b827a0bf1de59628401a7700c1fb not found: ID does not exist" containerID="278550fd60063b49953c40090a90bc342ae6b827a0bf1de59628401a7700c1fb" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.959266 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"278550fd60063b49953c40090a90bc342ae6b827a0bf1de59628401a7700c1fb"} err="failed to get container status \"278550fd60063b49953c40090a90bc342ae6b827a0bf1de59628401a7700c1fb\": rpc error: code = NotFound desc = could not find container \"278550fd60063b49953c40090a90bc342ae6b827a0bf1de59628401a7700c1fb\": container with ID starting with 278550fd60063b49953c40090a90bc342ae6b827a0bf1de59628401a7700c1fb not found: ID does not exist" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.959304 4721 scope.go:117] "RemoveContainer" containerID="e0c93ba403bb3249e362943f6bbbde5f04706709dac91f6ae4e753f375727084" Feb 02 13:40:40 crc kubenswrapper[4721]: E0202 13:40:40.959907 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0c93ba403bb3249e362943f6bbbde5f04706709dac91f6ae4e753f375727084\": container with ID starting with e0c93ba403bb3249e362943f6bbbde5f04706709dac91f6ae4e753f375727084 not found: ID does not exist" containerID="e0c93ba403bb3249e362943f6bbbde5f04706709dac91f6ae4e753f375727084" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.959948 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0c93ba403bb3249e362943f6bbbde5f04706709dac91f6ae4e753f375727084"} err="failed to get container status \"e0c93ba403bb3249e362943f6bbbde5f04706709dac91f6ae4e753f375727084\": rpc error: code = NotFound desc = could not find container \"e0c93ba403bb3249e362943f6bbbde5f04706709dac91f6ae4e753f375727084\": container with ID starting with e0c93ba403bb3249e362943f6bbbde5f04706709dac91f6ae4e753f375727084 not found: ID does not exist" Feb 02 13:40:40 crc 
kubenswrapper[4721]: I0202 13:40:40.959997 4721 scope.go:117] "RemoveContainer" containerID="eb2c414ab7a095044068fc325ff9cdd4824dcdcf846d8f08971020fc8acf2a20" Feb 02 13:40:40 crc kubenswrapper[4721]: E0202 13:40:40.960541 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb2c414ab7a095044068fc325ff9cdd4824dcdcf846d8f08971020fc8acf2a20\": container with ID starting with eb2c414ab7a095044068fc325ff9cdd4824dcdcf846d8f08971020fc8acf2a20 not found: ID does not exist" containerID="eb2c414ab7a095044068fc325ff9cdd4824dcdcf846d8f08971020fc8acf2a20" Feb 02 13:40:40 crc kubenswrapper[4721]: I0202 13:40:40.960569 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb2c414ab7a095044068fc325ff9cdd4824dcdcf846d8f08971020fc8acf2a20"} err="failed to get container status \"eb2c414ab7a095044068fc325ff9cdd4824dcdcf846d8f08971020fc8acf2a20\": rpc error: code = NotFound desc = could not find container \"eb2c414ab7a095044068fc325ff9cdd4824dcdcf846d8f08971020fc8acf2a20\": container with ID starting with eb2c414ab7a095044068fc325ff9cdd4824dcdcf846d8f08971020fc8acf2a20 not found: ID does not exist" Feb 02 13:40:41 crc kubenswrapper[4721]: I0202 13:40:41.137246 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-grdzv"] Feb 02 13:40:41 crc kubenswrapper[4721]: I0202 13:40:41.147736 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-grdzv"] Feb 02 13:40:42 crc kubenswrapper[4721]: I0202 13:40:42.427923 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd185aef-3d37-4ad6-91f6-1470e8c39999" path="/var/lib/kubelet/pods/fd185aef-3d37-4ad6-91f6-1470e8c39999/volumes" Feb 02 13:40:44 crc kubenswrapper[4721]: I0202 13:40:44.763718 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:40:44 crc kubenswrapper[4721]: I0202 13:40:44.764227 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:41:14 crc kubenswrapper[4721]: I0202 13:41:14.763482 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:41:14 crc kubenswrapper[4721]: I0202 13:41:14.764200 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:41:14 crc kubenswrapper[4721]: I0202 13:41:14.764266 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:41:14 crc 
kubenswrapper[4721]: I0202 13:41:14.765548 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 13:41:14 crc kubenswrapper[4721]: I0202 13:41:14.765661 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" gracePeriod=600 Feb 02 13:41:14 crc kubenswrapper[4721]: E0202 13:41:14.911705 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:41:15 crc kubenswrapper[4721]: I0202 13:41:15.262887 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" exitCode=0 Feb 02 13:41:15 crc kubenswrapper[4721]: I0202 13:41:15.262930 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375"} Feb 02 13:41:15 crc kubenswrapper[4721]: I0202 13:41:15.262972 4721 scope.go:117] "RemoveContainer" containerID="8db65d587513bf606777ac46eb9a8dd677cd18af3524da4a78bb404625c5d58f" Feb 02 13:41:15 crc kubenswrapper[4721]: I0202 13:41:15.264205 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:41:15 crc kubenswrapper[4721]: E0202 13:41:15.264941 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:41:26 crc kubenswrapper[4721]: I0202 13:41:26.411403 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:41:26 crc kubenswrapper[4721]: E0202 13:41:26.412585 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:41:37 crc kubenswrapper[4721]: I0202 13:41:37.410438 4721 scope.go:117] "RemoveContainer" 
containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:41:37 crc kubenswrapper[4721]: E0202 13:41:37.411337 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:41:51 crc kubenswrapper[4721]: I0202 13:41:51.409815 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:41:51 crc kubenswrapper[4721]: E0202 13:41:51.412322 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:42:04 crc kubenswrapper[4721]: I0202 13:42:04.409983 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:42:04 crc kubenswrapper[4721]: E0202 13:42:04.411018 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:42:18 crc kubenswrapper[4721]: I0202 13:42:18.409259 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:42:18 crc kubenswrapper[4721]: E0202 13:42:18.410012 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:42:29 crc kubenswrapper[4721]: I0202 13:42:29.409406 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:42:29 crc kubenswrapper[4721]: E0202 13:42:29.410301 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:42:40 crc kubenswrapper[4721]: I0202 13:42:40.419089 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:42:40 crc kubenswrapper[4721]: E0202 13:42:40.420044 4721 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:42:51 crc kubenswrapper[4721]: I0202 13:42:51.410479 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:42:51 crc kubenswrapper[4721]: E0202 13:42:51.411170 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:43:02 crc kubenswrapper[4721]: I0202 13:43:02.410294 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:43:02 crc kubenswrapper[4721]: E0202 13:43:02.411221 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:43:13 crc kubenswrapper[4721]: I0202 13:43:13.412021 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:43:13 crc kubenswrapper[4721]: E0202 13:43:13.413382 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:43:24 crc kubenswrapper[4721]: I0202 13:43:24.410559 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:43:24 crc kubenswrapper[4721]: E0202 13:43:24.412002 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:43:39 crc kubenswrapper[4721]: I0202 13:43:39.411192 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:43:39 crc kubenswrapper[4721]: E0202 13:43:39.411815 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:43:53 crc kubenswrapper[4721]: I0202 13:43:53.410830 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:43:53 crc kubenswrapper[4721]: E0202 13:43:53.415590 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:44:08 crc kubenswrapper[4721]: I0202 13:44:08.410952 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:44:08 crc kubenswrapper[4721]: E0202 13:44:08.411982 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:44:23 crc kubenswrapper[4721]: I0202 13:44:23.409818 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:44:23 crc kubenswrapper[4721]: E0202 13:44:23.410725 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:44:26 crc kubenswrapper[4721]: I0202 13:44:26.751240 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vzq62"] Feb 02 13:44:26 crc kubenswrapper[4721]: E0202 13:44:26.752645 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd185aef-3d37-4ad6-91f6-1470e8c39999" containerName="extract-utilities" Feb 02 13:44:26 crc kubenswrapper[4721]: I0202 13:44:26.752671 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd185aef-3d37-4ad6-91f6-1470e8c39999" containerName="extract-utilities" Feb 02 13:44:26 crc kubenswrapper[4721]: E0202 13:44:26.752702 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd185aef-3d37-4ad6-91f6-1470e8c39999" containerName="registry-server" Feb 02 13:44:26 crc kubenswrapper[4721]: I0202 13:44:26.752714 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd185aef-3d37-4ad6-91f6-1470e8c39999" containerName="registry-server" Feb 02 13:44:26 crc kubenswrapper[4721]: E0202 13:44:26.752751 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd185aef-3d37-4ad6-91f6-1470e8c39999" containerName="extract-content" Feb 02 13:44:26 crc kubenswrapper[4721]: I0202 13:44:26.752761 4721 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fd185aef-3d37-4ad6-91f6-1470e8c39999" containerName="extract-content" Feb 02 13:44:26 crc kubenswrapper[4721]: I0202 13:44:26.753206 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd185aef-3d37-4ad6-91f6-1470e8c39999" containerName="registry-server" Feb 02 13:44:26 crc kubenswrapper[4721]: I0202 13:44:26.760873 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:26 crc kubenswrapper[4721]: I0202 13:44:26.775523 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vzq62"] Feb 02 13:44:26 crc kubenswrapper[4721]: I0202 13:44:26.944147 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7ad3a36-c7b7-42f9-a87a-773830064c68-catalog-content\") pod \"redhat-operators-vzq62\" (UID: \"c7ad3a36-c7b7-42f9-a87a-773830064c68\") " pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:26 crc kubenswrapper[4721]: I0202 13:44:26.944458 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7ad3a36-c7b7-42f9-a87a-773830064c68-utilities\") pod \"redhat-operators-vzq62\" (UID: \"c7ad3a36-c7b7-42f9-a87a-773830064c68\") " pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:26 crc kubenswrapper[4721]: I0202 13:44:26.944602 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbz8s\" (UniqueName: \"kubernetes.io/projected/c7ad3a36-c7b7-42f9-a87a-773830064c68-kube-api-access-nbz8s\") pod \"redhat-operators-vzq62\" (UID: \"c7ad3a36-c7b7-42f9-a87a-773830064c68\") " pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:27 crc kubenswrapper[4721]: I0202 13:44:27.047009 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbz8s\" (UniqueName: \"kubernetes.io/projected/c7ad3a36-c7b7-42f9-a87a-773830064c68-kube-api-access-nbz8s\") pod \"redhat-operators-vzq62\" (UID: \"c7ad3a36-c7b7-42f9-a87a-773830064c68\") " pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:27 crc kubenswrapper[4721]: I0202 13:44:27.047166 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7ad3a36-c7b7-42f9-a87a-773830064c68-catalog-content\") pod \"redhat-operators-vzq62\" (UID: \"c7ad3a36-c7b7-42f9-a87a-773830064c68\") " pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:27 crc kubenswrapper[4721]: I0202 13:44:27.047191 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7ad3a36-c7b7-42f9-a87a-773830064c68-utilities\") pod \"redhat-operators-vzq62\" (UID: \"c7ad3a36-c7b7-42f9-a87a-773830064c68\") " pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:27 crc kubenswrapper[4721]: I0202 13:44:27.047716 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7ad3a36-c7b7-42f9-a87a-773830064c68-utilities\") pod \"redhat-operators-vzq62\" (UID: \"c7ad3a36-c7b7-42f9-a87a-773830064c68\") " pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:27 crc kubenswrapper[4721]: I0202 13:44:27.047804 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7ad3a36-c7b7-42f9-a87a-773830064c68-catalog-content\") pod \"redhat-operators-vzq62\" (UID: \"c7ad3a36-c7b7-42f9-a87a-773830064c68\") " pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:27 crc kubenswrapper[4721]: I0202 13:44:27.073386 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbz8s\" (UniqueName: \"kubernetes.io/projected/c7ad3a36-c7b7-42f9-a87a-773830064c68-kube-api-access-nbz8s\") pod \"redhat-operators-vzq62\" (UID: \"c7ad3a36-c7b7-42f9-a87a-773830064c68\") " pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:27 crc kubenswrapper[4721]: I0202 13:44:27.099959 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:27 crc kubenswrapper[4721]: I0202 13:44:27.628296 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vzq62"] Feb 02 13:44:28 crc kubenswrapper[4721]: I0202 13:44:28.592241 4721 generic.go:334] "Generic (PLEG): container finished" podID="c7ad3a36-c7b7-42f9-a87a-773830064c68" containerID="6ed6ebdce7dc1f0c63d36ffc0347489abd3b475d4ec5370a068a0f2d8d8aed2e" exitCode=0 Feb 02 13:44:28 crc kubenswrapper[4721]: I0202 13:44:28.592283 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzq62" event={"ID":"c7ad3a36-c7b7-42f9-a87a-773830064c68","Type":"ContainerDied","Data":"6ed6ebdce7dc1f0c63d36ffc0347489abd3b475d4ec5370a068a0f2d8d8aed2e"} Feb 02 13:44:28 crc kubenswrapper[4721]: I0202 13:44:28.592566 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzq62" event={"ID":"c7ad3a36-c7b7-42f9-a87a-773830064c68","Type":"ContainerStarted","Data":"77b7d66f25e886f7db4f55994fba899900aceb23df2d0536848f46034d8e549e"} Feb 02 13:44:29 crc kubenswrapper[4721]: I0202 13:44:29.607305 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzq62" event={"ID":"c7ad3a36-c7b7-42f9-a87a-773830064c68","Type":"ContainerStarted","Data":"a2fe9cbea688d61acdba1a4b24bb3ac1862820b63ba60d3eb6428d91935c2ceb"} Feb 02 13:44:34 crc kubenswrapper[4721]: I0202 13:44:34.678047 4721 generic.go:334] "Generic (PLEG): container finished" podID="c7ad3a36-c7b7-42f9-a87a-773830064c68" containerID="a2fe9cbea688d61acdba1a4b24bb3ac1862820b63ba60d3eb6428d91935c2ceb" exitCode=0 Feb 02 13:44:34 crc kubenswrapper[4721]: I0202 13:44:34.678132 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzq62" event={"ID":"c7ad3a36-c7b7-42f9-a87a-773830064c68","Type":"ContainerDied","Data":"a2fe9cbea688d61acdba1a4b24bb3ac1862820b63ba60d3eb6428d91935c2ceb"} Feb 02 13:44:35 crc kubenswrapper[4721]: I0202 13:44:35.409613 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:44:35 crc kubenswrapper[4721]: E0202 13:44:35.410448 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:44:35 crc kubenswrapper[4721]: I0202 13:44:35.701491 4721 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzq62" event={"ID":"c7ad3a36-c7b7-42f9-a87a-773830064c68","Type":"ContainerStarted","Data":"09ad1715d2346f747a6d653972cea32abcdad8091f3d1786f97d4c5c0bdd3ce8"} Feb 02 13:44:35 crc kubenswrapper[4721]: I0202 13:44:35.745416 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vzq62" podStartSLOduration=3.245523889 podStartE2EDuration="9.745387898s" podCreationTimestamp="2026-02-02 13:44:26 +0000 UTC" firstStartedPulling="2026-02-02 13:44:28.595258184 +0000 UTC m=+2608.897772573" lastFinishedPulling="2026-02-02 13:44:35.095122173 +0000 UTC m=+2615.397636582" observedRunningTime="2026-02-02 13:44:35.733721103 +0000 UTC m=+2616.036235492" watchObservedRunningTime="2026-02-02 13:44:35.745387898 +0000 UTC m=+2616.047902317" Feb 02 13:44:37 crc kubenswrapper[4721]: I0202 13:44:37.101291 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:37 crc kubenswrapper[4721]: I0202 13:44:37.101703 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:38 crc kubenswrapper[4721]: I0202 13:44:38.175405 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vzq62" podUID="c7ad3a36-c7b7-42f9-a87a-773830064c68" containerName="registry-server" probeResult="failure" output=< Feb 02 13:44:38 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 13:44:38 crc kubenswrapper[4721]: > Feb 02 13:44:46 crc kubenswrapper[4721]: I0202 13:44:46.409903 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:44:46 crc kubenswrapper[4721]: E0202 13:44:46.410835 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:44:47 crc kubenswrapper[4721]: I0202 13:44:47.164955 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:47 crc kubenswrapper[4721]: I0202 13:44:47.258272 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:47 crc kubenswrapper[4721]: I0202 13:44:47.425183 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vzq62"] Feb 02 13:44:48 crc kubenswrapper[4721]: I0202 13:44:48.866288 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vzq62" podUID="c7ad3a36-c7b7-42f9-a87a-773830064c68" containerName="registry-server" containerID="cri-o://09ad1715d2346f747a6d653972cea32abcdad8091f3d1786f97d4c5c0bdd3ce8" gracePeriod=2 Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.491243 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.650835 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbz8s\" (UniqueName: \"kubernetes.io/projected/c7ad3a36-c7b7-42f9-a87a-773830064c68-kube-api-access-nbz8s\") pod \"c7ad3a36-c7b7-42f9-a87a-773830064c68\" (UID: \"c7ad3a36-c7b7-42f9-a87a-773830064c68\") " Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.651148 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7ad3a36-c7b7-42f9-a87a-773830064c68-catalog-content\") pod \"c7ad3a36-c7b7-42f9-a87a-773830064c68\" (UID: \"c7ad3a36-c7b7-42f9-a87a-773830064c68\") " Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.651281 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7ad3a36-c7b7-42f9-a87a-773830064c68-utilities\") pod \"c7ad3a36-c7b7-42f9-a87a-773830064c68\" (UID: \"c7ad3a36-c7b7-42f9-a87a-773830064c68\") " Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.652026 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7ad3a36-c7b7-42f9-a87a-773830064c68-utilities" (OuterVolumeSpecName: "utilities") pod "c7ad3a36-c7b7-42f9-a87a-773830064c68" (UID: "c7ad3a36-c7b7-42f9-a87a-773830064c68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.662966 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ad3a36-c7b7-42f9-a87a-773830064c68-kube-api-access-nbz8s" (OuterVolumeSpecName: "kube-api-access-nbz8s") pod "c7ad3a36-c7b7-42f9-a87a-773830064c68" (UID: "c7ad3a36-c7b7-42f9-a87a-773830064c68"). InnerVolumeSpecName "kube-api-access-nbz8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.754583 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7ad3a36-c7b7-42f9-a87a-773830064c68-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.754619 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbz8s\" (UniqueName: \"kubernetes.io/projected/c7ad3a36-c7b7-42f9-a87a-773830064c68-kube-api-access-nbz8s\") on node \"crc\" DevicePath \"\"" Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.784797 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7ad3a36-c7b7-42f9-a87a-773830064c68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7ad3a36-c7b7-42f9-a87a-773830064c68" (UID: "c7ad3a36-c7b7-42f9-a87a-773830064c68"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.857047 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7ad3a36-c7b7-42f9-a87a-773830064c68-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.878422 4721 generic.go:334] "Generic (PLEG): container finished" podID="c7ad3a36-c7b7-42f9-a87a-773830064c68" containerID="09ad1715d2346f747a6d653972cea32abcdad8091f3d1786f97d4c5c0bdd3ce8" exitCode=0 Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.878462 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzq62" event={"ID":"c7ad3a36-c7b7-42f9-a87a-773830064c68","Type":"ContainerDied","Data":"09ad1715d2346f747a6d653972cea32abcdad8091f3d1786f97d4c5c0bdd3ce8"} Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.878505 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzq62" event={"ID":"c7ad3a36-c7b7-42f9-a87a-773830064c68","Type":"ContainerDied","Data":"77b7d66f25e886f7db4f55994fba899900aceb23df2d0536848f46034d8e549e"} Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.878527 4721 scope.go:117] "RemoveContainer" containerID="09ad1715d2346f747a6d653972cea32abcdad8091f3d1786f97d4c5c0bdd3ce8" Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.878527 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vzq62" Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.915593 4721 scope.go:117] "RemoveContainer" containerID="a2fe9cbea688d61acdba1a4b24bb3ac1862820b63ba60d3eb6428d91935c2ceb" Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.921489 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vzq62"] Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.932100 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vzq62"] Feb 02 13:44:49 crc kubenswrapper[4721]: I0202 13:44:49.956502 4721 scope.go:117] "RemoveContainer" containerID="6ed6ebdce7dc1f0c63d36ffc0347489abd3b475d4ec5370a068a0f2d8d8aed2e" Feb 02 13:44:50 crc kubenswrapper[4721]: I0202 13:44:50.020614 4721 scope.go:117] "RemoveContainer" containerID="09ad1715d2346f747a6d653972cea32abcdad8091f3d1786f97d4c5c0bdd3ce8" Feb 02 13:44:50 crc kubenswrapper[4721]: E0202 13:44:50.021076 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ad1715d2346f747a6d653972cea32abcdad8091f3d1786f97d4c5c0bdd3ce8\": container with ID starting with 09ad1715d2346f747a6d653972cea32abcdad8091f3d1786f97d4c5c0bdd3ce8 not found: ID does not exist" containerID="09ad1715d2346f747a6d653972cea32abcdad8091f3d1786f97d4c5c0bdd3ce8" Feb 02 13:44:50 crc kubenswrapper[4721]: I0202 13:44:50.021109 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ad1715d2346f747a6d653972cea32abcdad8091f3d1786f97d4c5c0bdd3ce8"} err="failed to get container status \"09ad1715d2346f747a6d653972cea32abcdad8091f3d1786f97d4c5c0bdd3ce8\": rpc error: code = NotFound desc = could not find container \"09ad1715d2346f747a6d653972cea32abcdad8091f3d1786f97d4c5c0bdd3ce8\": container with ID starting with 09ad1715d2346f747a6d653972cea32abcdad8091f3d1786f97d4c5c0bdd3ce8 not found: ID does not exist" Feb 02 13:44:50 crc 
kubenswrapper[4721]: I0202 13:44:50.021129 4721 scope.go:117] "RemoveContainer" containerID="a2fe9cbea688d61acdba1a4b24bb3ac1862820b63ba60d3eb6428d91935c2ceb" Feb 02 13:44:50 crc kubenswrapper[4721]: E0202 13:44:50.021598 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2fe9cbea688d61acdba1a4b24bb3ac1862820b63ba60d3eb6428d91935c2ceb\": container with ID starting with a2fe9cbea688d61acdba1a4b24bb3ac1862820b63ba60d3eb6428d91935c2ceb not found: ID does not exist" containerID="a2fe9cbea688d61acdba1a4b24bb3ac1862820b63ba60d3eb6428d91935c2ceb" Feb 02 13:44:50 crc kubenswrapper[4721]: I0202 13:44:50.021672 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2fe9cbea688d61acdba1a4b24bb3ac1862820b63ba60d3eb6428d91935c2ceb"} err="failed to get container status \"a2fe9cbea688d61acdba1a4b24bb3ac1862820b63ba60d3eb6428d91935c2ceb\": rpc error: code = NotFound desc = could not find container \"a2fe9cbea688d61acdba1a4b24bb3ac1862820b63ba60d3eb6428d91935c2ceb\": container with ID starting with a2fe9cbea688d61acdba1a4b24bb3ac1862820b63ba60d3eb6428d91935c2ceb not found: ID does not exist" Feb 02 13:44:50 crc kubenswrapper[4721]: I0202 13:44:50.021717 4721 scope.go:117] "RemoveContainer" containerID="6ed6ebdce7dc1f0c63d36ffc0347489abd3b475d4ec5370a068a0f2d8d8aed2e" Feb 02 13:44:50 crc kubenswrapper[4721]: E0202 13:44:50.022429 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ed6ebdce7dc1f0c63d36ffc0347489abd3b475d4ec5370a068a0f2d8d8aed2e\": container with ID starting with 6ed6ebdce7dc1f0c63d36ffc0347489abd3b475d4ec5370a068a0f2d8d8aed2e not found: ID does not exist" containerID="6ed6ebdce7dc1f0c63d36ffc0347489abd3b475d4ec5370a068a0f2d8d8aed2e" Feb 02 13:44:50 crc kubenswrapper[4721]: I0202 13:44:50.022459 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ed6ebdce7dc1f0c63d36ffc0347489abd3b475d4ec5370a068a0f2d8d8aed2e"} err="failed to get container status \"6ed6ebdce7dc1f0c63d36ffc0347489abd3b475d4ec5370a068a0f2d8d8aed2e\": rpc error: code = NotFound desc = could not find container \"6ed6ebdce7dc1f0c63d36ffc0347489abd3b475d4ec5370a068a0f2d8d8aed2e\": container with ID starting with 6ed6ebdce7dc1f0c63d36ffc0347489abd3b475d4ec5370a068a0f2d8d8aed2e not found: ID does not exist" Feb 02 13:44:50 crc kubenswrapper[4721]: I0202 13:44:50.436157 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7ad3a36-c7b7-42f9-a87a-773830064c68" path="/var/lib/kubelet/pods/c7ad3a36-c7b7-42f9-a87a-773830064c68/volumes" Feb 02 13:44:59 crc kubenswrapper[4721]: I0202 13:44:59.410720 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:44:59 crc kubenswrapper[4721]: E0202 13:44:59.413989 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.164453 4721 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh"] Feb 02 13:45:00 crc kubenswrapper[4721]: E0202 13:45:00.164940 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ad3a36-c7b7-42f9-a87a-773830064c68" containerName="extract-content" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.164957 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ad3a36-c7b7-42f9-a87a-773830064c68" containerName="extract-content" Feb 02 13:45:00 crc kubenswrapper[4721]: E0202 13:45:00.164969 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ad3a36-c7b7-42f9-a87a-773830064c68" containerName="extract-utilities" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.164975 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ad3a36-c7b7-42f9-a87a-773830064c68" containerName="extract-utilities" Feb 02 13:45:00 crc kubenswrapper[4721]: E0202 13:45:00.164998 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ad3a36-c7b7-42f9-a87a-773830064c68" containerName="registry-server" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.165006 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ad3a36-c7b7-42f9-a87a-773830064c68" containerName="registry-server" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.165247 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ad3a36-c7b7-42f9-a87a-773830064c68" containerName="registry-server" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.166238 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.170469 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.171994 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.181079 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh"] Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.335387 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-config-volume\") pod \"collect-profiles-29500665-gnnxh\" (UID: \"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.335428 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbw5l\" (UniqueName: \"kubernetes.io/projected/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-kube-api-access-lbw5l\") pod \"collect-profiles-29500665-gnnxh\" (UID: \"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.335956 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-secret-volume\") pod \"collect-profiles-29500665-gnnxh\" (UID: \"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.437915 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-secret-volume\") pod \"collect-profiles-29500665-gnnxh\" (UID: \"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.438037 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-config-volume\") pod \"collect-profiles-29500665-gnnxh\" (UID: \"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.438054 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbw5l\" (UniqueName: \"kubernetes.io/projected/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-kube-api-access-lbw5l\") pod \"collect-profiles-29500665-gnnxh\" (UID: \"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.440818 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.450770 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-config-volume\") pod \"collect-profiles-29500665-gnnxh\" (UID: \"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.460388 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-secret-volume\") pod \"collect-profiles-29500665-gnnxh\" (UID: \"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.461154 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbw5l\" (UniqueName: \"kubernetes.io/projected/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-kube-api-access-lbw5l\") pod \"collect-profiles-29500665-gnnxh\" (UID: \"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.502767 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 13:45:00 crc kubenswrapper[4721]: I0202 13:45:00.510882 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" Feb 02 13:45:01 crc kubenswrapper[4721]: I0202 13:45:01.120728 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh"] Feb 02 13:45:01 crc kubenswrapper[4721]: I0202 13:45:01.145186 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" event={"ID":"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0","Type":"ContainerStarted","Data":"205ede0bed5fd25627359c20802c03d6484bd8e30b4d9b9a947691ffeb7d8f21"} Feb 02 13:45:02 crc kubenswrapper[4721]: I0202 13:45:02.155752 4721 generic.go:334] "Generic (PLEG): container finished" podID="0c1f0398-e18b-44f0-b0a8-21f2de8af4d0" containerID="65b28cee5d7aba79bd5fdc801054328d6006f1cbcc896f6fdd692cc1c3bf2690" exitCode=0 Feb 02 13:45:02 crc kubenswrapper[4721]: I0202 13:45:02.155886 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" event={"ID":"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0","Type":"ContainerDied","Data":"65b28cee5d7aba79bd5fdc801054328d6006f1cbcc896f6fdd692cc1c3bf2690"} Feb 02 13:45:03 crc kubenswrapper[4721]: I0202 13:45:03.653767 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" Feb 02 13:45:03 crc kubenswrapper[4721]: I0202 13:45:03.745441 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-config-volume\") pod \"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0\" (UID: \"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0\") " Feb 02 13:45:03 crc kubenswrapper[4721]: I0202 13:45:03.745580 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-secret-volume\") pod \"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0\" (UID: \"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0\") " Feb 02 13:45:03 crc kubenswrapper[4721]: I0202 13:45:03.745685 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbw5l\" (UniqueName: \"kubernetes.io/projected/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-kube-api-access-lbw5l\") pod \"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0\" (UID: \"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0\") " Feb 02 13:45:03 crc kubenswrapper[4721]: I0202 13:45:03.746414 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-config-volume" (OuterVolumeSpecName: "config-volume") pod "0c1f0398-e18b-44f0-b0a8-21f2de8af4d0" (UID: "0c1f0398-e18b-44f0-b0a8-21f2de8af4d0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:45:03 crc kubenswrapper[4721]: I0202 13:45:03.751348 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0c1f0398-e18b-44f0-b0a8-21f2de8af4d0" (UID: "0c1f0398-e18b-44f0-b0a8-21f2de8af4d0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:45:03 crc kubenswrapper[4721]: I0202 13:45:03.751402 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-kube-api-access-lbw5l" (OuterVolumeSpecName: "kube-api-access-lbw5l") pod "0c1f0398-e18b-44f0-b0a8-21f2de8af4d0" (UID: "0c1f0398-e18b-44f0-b0a8-21f2de8af4d0"). InnerVolumeSpecName "kube-api-access-lbw5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:45:03 crc kubenswrapper[4721]: I0202 13:45:03.848594 4721 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 13:45:03 crc kubenswrapper[4721]: I0202 13:45:03.848627 4721 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 13:45:03 crc kubenswrapper[4721]: I0202 13:45:03.848638 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbw5l\" (UniqueName: \"kubernetes.io/projected/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0-kube-api-access-lbw5l\") on node \"crc\" DevicePath \"\"" Feb 02 13:45:04 crc kubenswrapper[4721]: I0202 13:45:04.181670 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" event={"ID":"0c1f0398-e18b-44f0-b0a8-21f2de8af4d0","Type":"ContainerDied","Data":"205ede0bed5fd25627359c20802c03d6484bd8e30b4d9b9a947691ffeb7d8f21"} Feb 02 13:45:04 crc kubenswrapper[4721]: I0202 13:45:04.181714 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="205ede0bed5fd25627359c20802c03d6484bd8e30b4d9b9a947691ffeb7d8f21" Feb 02 13:45:04 crc kubenswrapper[4721]: I0202 13:45:04.181730 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh" Feb 02 13:45:04 crc kubenswrapper[4721]: I0202 13:45:04.752619 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg"] Feb 02 13:45:04 crc kubenswrapper[4721]: I0202 13:45:04.762169 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500620-rxhcg"] Feb 02 13:45:06 crc kubenswrapper[4721]: I0202 13:45:06.422020 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="873a8c0c-9da4-4619-9ebf-7a327eb22b7e" path="/var/lib/kubelet/pods/873a8c0c-9da4-4619-9ebf-7a327eb22b7e/volumes" Feb 02 13:45:13 crc kubenswrapper[4721]: I0202 13:45:13.969797 4721 scope.go:117] "RemoveContainer" containerID="e5df09a9819f06ce199926abbd018fc8f1a3ae0cbae66765ccb3d7e1c3a1a81c" Feb 02 13:45:15 crc kubenswrapper[4721]: I0202 13:45:15.410508 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:45:15 crc kubenswrapper[4721]: E0202 13:45:15.411427 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:45:28 crc kubenswrapper[4721]: I0202 13:45:28.607633 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9hmgc"] Feb 02 13:45:28 crc kubenswrapper[4721]: E0202 13:45:28.608544 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c1f0398-e18b-44f0-b0a8-21f2de8af4d0" containerName="collect-profiles" Feb 02 13:45:28 crc kubenswrapper[4721]: I0202 13:45:28.608556 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c1f0398-e18b-44f0-b0a8-21f2de8af4d0" containerName="collect-profiles" Feb 02 13:45:28 crc kubenswrapper[4721]: I0202 13:45:28.608791 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c1f0398-e18b-44f0-b0a8-21f2de8af4d0" containerName="collect-profiles" Feb 02 13:45:28 crc kubenswrapper[4721]: I0202 13:45:28.611260 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:28 crc kubenswrapper[4721]: I0202 13:45:28.629621 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9hmgc"] Feb 02 13:45:28 crc kubenswrapper[4721]: I0202 13:45:28.763014 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbw72\" (UniqueName: \"kubernetes.io/projected/5dcec122-4dce-4cdd-ad6e-24defada74b1-kube-api-access-tbw72\") pod \"community-operators-9hmgc\" (UID: \"5dcec122-4dce-4cdd-ad6e-24defada74b1\") " pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:28 crc kubenswrapper[4721]: I0202 13:45:28.763192 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dcec122-4dce-4cdd-ad6e-24defada74b1-catalog-content\") pod \"community-operators-9hmgc\" (UID: \"5dcec122-4dce-4cdd-ad6e-24defada74b1\") " pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:28 crc kubenswrapper[4721]: I0202 13:45:28.763357 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dcec122-4dce-4cdd-ad6e-24defada74b1-utilities\") pod \"community-operators-9hmgc\" (UID: \"5dcec122-4dce-4cdd-ad6e-24defada74b1\") " pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:28 crc kubenswrapper[4721]: I0202 13:45:28.865726 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbw72\" (UniqueName: \"kubernetes.io/projected/5dcec122-4dce-4cdd-ad6e-24defada74b1-kube-api-access-tbw72\") pod \"community-operators-9hmgc\" (UID: \"5dcec122-4dce-4cdd-ad6e-24defada74b1\") " pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:28 crc kubenswrapper[4721]: I0202 13:45:28.866164 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dcec122-4dce-4cdd-ad6e-24defada74b1-catalog-content\") pod \"community-operators-9hmgc\" (UID: \"5dcec122-4dce-4cdd-ad6e-24defada74b1\") " pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:28 crc kubenswrapper[4721]: I0202 13:45:28.866266 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dcec122-4dce-4cdd-ad6e-24defada74b1-utilities\") pod \"community-operators-9hmgc\" (UID: \"5dcec122-4dce-4cdd-ad6e-24defada74b1\") " pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:28 crc kubenswrapper[4721]: I0202 13:45:28.866694 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dcec122-4dce-4cdd-ad6e-24defada74b1-catalog-content\") pod \"community-operators-9hmgc\" (UID: \"5dcec122-4dce-4cdd-ad6e-24defada74b1\") " pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:28 crc kubenswrapper[4721]: I0202 13:45:28.866852 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dcec122-4dce-4cdd-ad6e-24defada74b1-utilities\") pod \"community-operators-9hmgc\" (UID: \"5dcec122-4dce-4cdd-ad6e-24defada74b1\") " pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:28 crc kubenswrapper[4721]: I0202 13:45:28.893129 4721 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tbw72\" (UniqueName: \"kubernetes.io/projected/5dcec122-4dce-4cdd-ad6e-24defada74b1-kube-api-access-tbw72\") pod \"community-operators-9hmgc\" (UID: \"5dcec122-4dce-4cdd-ad6e-24defada74b1\") " pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:28 crc kubenswrapper[4721]: I0202 13:45:28.935399 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:29 crc kubenswrapper[4721]: I0202 13:45:29.475179 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9hmgc"] Feb 02 13:45:30 crc kubenswrapper[4721]: I0202 13:45:30.427980 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:45:30 crc kubenswrapper[4721]: E0202 13:45:30.429292 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:45:30 crc kubenswrapper[4721]: I0202 13:45:30.485454 4721 generic.go:334] "Generic (PLEG): container finished" podID="5dcec122-4dce-4cdd-ad6e-24defada74b1" containerID="0e7cad1ebd7f7a6d05f8327943a512f5f423e3558fe92dd60c9c32c61be8d8d0" exitCode=0 Feb 02 13:45:30 crc kubenswrapper[4721]: I0202 13:45:30.485506 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hmgc" event={"ID":"5dcec122-4dce-4cdd-ad6e-24defada74b1","Type":"ContainerDied","Data":"0e7cad1ebd7f7a6d05f8327943a512f5f423e3558fe92dd60c9c32c61be8d8d0"} Feb 02 13:45:30 crc kubenswrapper[4721]: I0202 13:45:30.485534 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hmgc" event={"ID":"5dcec122-4dce-4cdd-ad6e-24defada74b1","Type":"ContainerStarted","Data":"43cd22e32e4c677a4e0ac7ef0068bea7db0f78bbfcb1330f8a9bfd2c11936245"} Feb 02 13:45:30 crc kubenswrapper[4721]: I0202 13:45:30.488870 4721 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 13:45:32 crc kubenswrapper[4721]: I0202 13:45:32.507702 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hmgc" event={"ID":"5dcec122-4dce-4cdd-ad6e-24defada74b1","Type":"ContainerStarted","Data":"4e56addc09f5ed3e298095888ff275b600501427656ee7ae5fa23269c5facfb3"} Feb 02 13:45:34 crc kubenswrapper[4721]: I0202 13:45:34.531645 4721 generic.go:334] "Generic (PLEG): container finished" podID="5dcec122-4dce-4cdd-ad6e-24defada74b1" containerID="4e56addc09f5ed3e298095888ff275b600501427656ee7ae5fa23269c5facfb3" exitCode=0 Feb 02 13:45:34 crc kubenswrapper[4721]: I0202 13:45:34.531707 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hmgc" event={"ID":"5dcec122-4dce-4cdd-ad6e-24defada74b1","Type":"ContainerDied","Data":"4e56addc09f5ed3e298095888ff275b600501427656ee7ae5fa23269c5facfb3"} Feb 02 13:45:35 crc kubenswrapper[4721]: I0202 13:45:35.553444 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hmgc" 
event={"ID":"5dcec122-4dce-4cdd-ad6e-24defada74b1","Type":"ContainerStarted","Data":"aaf621df543f587a9ca28561b98c44ff9f550b096b446211468f028715a41de2"} Feb 02 13:45:35 crc kubenswrapper[4721]: I0202 13:45:35.586805 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9hmgc" podStartSLOduration=3.059573766 podStartE2EDuration="7.586778682s" podCreationTimestamp="2026-02-02 13:45:28 +0000 UTC" firstStartedPulling="2026-02-02 13:45:30.488550259 +0000 UTC m=+2670.791064648" lastFinishedPulling="2026-02-02 13:45:35.015755175 +0000 UTC m=+2675.318269564" observedRunningTime="2026-02-02 13:45:35.580677317 +0000 UTC m=+2675.883191766" watchObservedRunningTime="2026-02-02 13:45:35.586778682 +0000 UTC m=+2675.889293081" Feb 02 13:45:38 crc kubenswrapper[4721]: I0202 13:45:38.936289 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:38 crc kubenswrapper[4721]: I0202 13:45:38.936822 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:39 crc kubenswrapper[4721]: I0202 13:45:39.005361 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:44 crc kubenswrapper[4721]: I0202 13:45:44.410106 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:45:44 crc kubenswrapper[4721]: E0202 13:45:44.411102 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:45:48 crc kubenswrapper[4721]: I0202 13:45:48.989172 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:49 crc kubenswrapper[4721]: I0202 13:45:49.038956 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9hmgc"] Feb 02 13:45:49 crc kubenswrapper[4721]: I0202 13:45:49.686745 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9hmgc" podUID="5dcec122-4dce-4cdd-ad6e-24defada74b1" containerName="registry-server" containerID="cri-o://aaf621df543f587a9ca28561b98c44ff9f550b096b446211468f028715a41de2" gracePeriod=2 Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.224248 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.308692 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dcec122-4dce-4cdd-ad6e-24defada74b1-catalog-content\") pod \"5dcec122-4dce-4cdd-ad6e-24defada74b1\" (UID: \"5dcec122-4dce-4cdd-ad6e-24defada74b1\") " Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.309049 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbw72\" (UniqueName: \"kubernetes.io/projected/5dcec122-4dce-4cdd-ad6e-24defada74b1-kube-api-access-tbw72\") pod \"5dcec122-4dce-4cdd-ad6e-24defada74b1\" (UID: \"5dcec122-4dce-4cdd-ad6e-24defada74b1\") " Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.309223 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dcec122-4dce-4cdd-ad6e-24defada74b1-utilities\") pod \"5dcec122-4dce-4cdd-ad6e-24defada74b1\" (UID: \"5dcec122-4dce-4cdd-ad6e-24defada74b1\") " Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.310298 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dcec122-4dce-4cdd-ad6e-24defada74b1-utilities" (OuterVolumeSpecName: "utilities") pod "5dcec122-4dce-4cdd-ad6e-24defada74b1" (UID: "5dcec122-4dce-4cdd-ad6e-24defada74b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.314940 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dcec122-4dce-4cdd-ad6e-24defada74b1-kube-api-access-tbw72" (OuterVolumeSpecName: "kube-api-access-tbw72") pod "5dcec122-4dce-4cdd-ad6e-24defada74b1" (UID: "5dcec122-4dce-4cdd-ad6e-24defada74b1"). InnerVolumeSpecName "kube-api-access-tbw72". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.364847 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dcec122-4dce-4cdd-ad6e-24defada74b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5dcec122-4dce-4cdd-ad6e-24defada74b1" (UID: "5dcec122-4dce-4cdd-ad6e-24defada74b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.412359 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5dcec122-4dce-4cdd-ad6e-24defada74b1-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.412388 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5dcec122-4dce-4cdd-ad6e-24defada74b1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.412400 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbw72\" (UniqueName: \"kubernetes.io/projected/5dcec122-4dce-4cdd-ad6e-24defada74b1-kube-api-access-tbw72\") on node \"crc\" DevicePath \"\"" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.700655 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9hmgc" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.700691 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hmgc" event={"ID":"5dcec122-4dce-4cdd-ad6e-24defada74b1","Type":"ContainerDied","Data":"aaf621df543f587a9ca28561b98c44ff9f550b096b446211468f028715a41de2"} Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.701248 4721 scope.go:117] "RemoveContainer" containerID="aaf621df543f587a9ca28561b98c44ff9f550b096b446211468f028715a41de2" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.700623 4721 generic.go:334] "Generic (PLEG): container finished" podID="5dcec122-4dce-4cdd-ad6e-24defada74b1" containerID="aaf621df543f587a9ca28561b98c44ff9f550b096b446211468f028715a41de2" exitCode=0 Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.701355 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hmgc" event={"ID":"5dcec122-4dce-4cdd-ad6e-24defada74b1","Type":"ContainerDied","Data":"43cd22e32e4c677a4e0ac7ef0068bea7db0f78bbfcb1330f8a9bfd2c11936245"} Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.730458 4721 scope.go:117] "RemoveContainer" containerID="4e56addc09f5ed3e298095888ff275b600501427656ee7ae5fa23269c5facfb3" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.736937 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9hmgc"] Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.750613 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9hmgc"] Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.775634 4721 scope.go:117] "RemoveContainer" containerID="0e7cad1ebd7f7a6d05f8327943a512f5f423e3558fe92dd60c9c32c61be8d8d0" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.839216 4721 scope.go:117] "RemoveContainer" containerID="aaf621df543f587a9ca28561b98c44ff9f550b096b446211468f028715a41de2" Feb 02 13:45:50 crc kubenswrapper[4721]: E0202 13:45:50.839701 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaf621df543f587a9ca28561b98c44ff9f550b096b446211468f028715a41de2\": container with ID starting with aaf621df543f587a9ca28561b98c44ff9f550b096b446211468f028715a41de2 not found: ID does not exist" containerID="aaf621df543f587a9ca28561b98c44ff9f550b096b446211468f028715a41de2" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.839751 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaf621df543f587a9ca28561b98c44ff9f550b096b446211468f028715a41de2"} err="failed to get container status \"aaf621df543f587a9ca28561b98c44ff9f550b096b446211468f028715a41de2\": rpc error: code = NotFound desc = could not find container \"aaf621df543f587a9ca28561b98c44ff9f550b096b446211468f028715a41de2\": container with ID starting with aaf621df543f587a9ca28561b98c44ff9f550b096b446211468f028715a41de2 not found: ID does not exist" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.839787 4721 scope.go:117] "RemoveContainer" containerID="4e56addc09f5ed3e298095888ff275b600501427656ee7ae5fa23269c5facfb3" Feb 02 13:45:50 crc kubenswrapper[4721]: E0202 13:45:50.840491 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e56addc09f5ed3e298095888ff275b600501427656ee7ae5fa23269c5facfb3\": container with ID 
starting with 4e56addc09f5ed3e298095888ff275b600501427656ee7ae5fa23269c5facfb3 not found: ID does not exist" containerID="4e56addc09f5ed3e298095888ff275b600501427656ee7ae5fa23269c5facfb3" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.840515 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e56addc09f5ed3e298095888ff275b600501427656ee7ae5fa23269c5facfb3"} err="failed to get container status \"4e56addc09f5ed3e298095888ff275b600501427656ee7ae5fa23269c5facfb3\": rpc error: code = NotFound desc = could not find container \"4e56addc09f5ed3e298095888ff275b600501427656ee7ae5fa23269c5facfb3\": container with ID starting with 4e56addc09f5ed3e298095888ff275b600501427656ee7ae5fa23269c5facfb3 not found: ID does not exist" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.840532 4721 scope.go:117] "RemoveContainer" containerID="0e7cad1ebd7f7a6d05f8327943a512f5f423e3558fe92dd60c9c32c61be8d8d0" Feb 02 13:45:50 crc kubenswrapper[4721]: E0202 13:45:50.840735 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e7cad1ebd7f7a6d05f8327943a512f5f423e3558fe92dd60c9c32c61be8d8d0\": container with ID starting with 0e7cad1ebd7f7a6d05f8327943a512f5f423e3558fe92dd60c9c32c61be8d8d0 not found: ID does not exist" containerID="0e7cad1ebd7f7a6d05f8327943a512f5f423e3558fe92dd60c9c32c61be8d8d0" Feb 02 13:45:50 crc kubenswrapper[4721]: I0202 13:45:50.840754 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e7cad1ebd7f7a6d05f8327943a512f5f423e3558fe92dd60c9c32c61be8d8d0"} err="failed to get container status \"0e7cad1ebd7f7a6d05f8327943a512f5f423e3558fe92dd60c9c32c61be8d8d0\": rpc error: code = NotFound desc = could not find container \"0e7cad1ebd7f7a6d05f8327943a512f5f423e3558fe92dd60c9c32c61be8d8d0\": container with ID starting with 0e7cad1ebd7f7a6d05f8327943a512f5f423e3558fe92dd60c9c32c61be8d8d0 not found: ID does not exist" Feb 02 13:45:52 crc kubenswrapper[4721]: I0202 13:45:52.423707 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dcec122-4dce-4cdd-ad6e-24defada74b1" path="/var/lib/kubelet/pods/5dcec122-4dce-4cdd-ad6e-24defada74b1/volumes" Feb 02 13:45:57 crc kubenswrapper[4721]: I0202 13:45:57.410519 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:45:57 crc kubenswrapper[4721]: E0202 13:45:57.411385 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:46:08 crc kubenswrapper[4721]: I0202 13:46:08.410116 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:46:08 crc kubenswrapper[4721]: E0202 13:46:08.410845 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:46:20 crc kubenswrapper[4721]: I0202 13:46:20.425336 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:46:21 crc kubenswrapper[4721]: I0202 13:46:21.026272 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"8408071901f7ef7e9399e59808de53d109544919537a0777d33359b2080a1dcb"} Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.056774 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lbt8t"] Feb 02 13:47:56 crc kubenswrapper[4721]: E0202 13:47:56.057853 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dcec122-4dce-4cdd-ad6e-24defada74b1" containerName="extract-content" Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.057870 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dcec122-4dce-4cdd-ad6e-24defada74b1" containerName="extract-content" Feb 02 13:47:56 crc kubenswrapper[4721]: E0202 13:47:56.057909 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dcec122-4dce-4cdd-ad6e-24defada74b1" containerName="registry-server" Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.057915 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dcec122-4dce-4cdd-ad6e-24defada74b1" containerName="registry-server" Feb 02 13:47:56 crc kubenswrapper[4721]: E0202 13:47:56.057935 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dcec122-4dce-4cdd-ad6e-24defada74b1" containerName="extract-utilities" Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.057941 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dcec122-4dce-4cdd-ad6e-24defada74b1" containerName="extract-utilities" Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.058204 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dcec122-4dce-4cdd-ad6e-24defada74b1" containerName="registry-server" Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.060219 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.088671 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbt8t"] Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.193839 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3deafa69-5966-4e1f-980e-425af72acdc0-utilities\") pod \"redhat-marketplace-lbt8t\" (UID: \"3deafa69-5966-4e1f-980e-425af72acdc0\") " pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.193955 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3deafa69-5966-4e1f-980e-425af72acdc0-catalog-content\") pod \"redhat-marketplace-lbt8t\" (UID: \"3deafa69-5966-4e1f-980e-425af72acdc0\") " pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.194235 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gv2r\" (UniqueName: \"kubernetes.io/projected/3deafa69-5966-4e1f-980e-425af72acdc0-kube-api-access-5gv2r\") pod \"redhat-marketplace-lbt8t\" (UID: \"3deafa69-5966-4e1f-980e-425af72acdc0\") " pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.296382 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gv2r\" (UniqueName: \"kubernetes.io/projected/3deafa69-5966-4e1f-980e-425af72acdc0-kube-api-access-5gv2r\") pod \"redhat-marketplace-lbt8t\" (UID: \"3deafa69-5966-4e1f-980e-425af72acdc0\") " pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.296528 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3deafa69-5966-4e1f-980e-425af72acdc0-utilities\") pod \"redhat-marketplace-lbt8t\" (UID: \"3deafa69-5966-4e1f-980e-425af72acdc0\") " pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.296581 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3deafa69-5966-4e1f-980e-425af72acdc0-catalog-content\") pod \"redhat-marketplace-lbt8t\" (UID: \"3deafa69-5966-4e1f-980e-425af72acdc0\") " pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.297056 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3deafa69-5966-4e1f-980e-425af72acdc0-catalog-content\") pod \"redhat-marketplace-lbt8t\" (UID: \"3deafa69-5966-4e1f-980e-425af72acdc0\") " pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.297311 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3deafa69-5966-4e1f-980e-425af72acdc0-utilities\") pod \"redhat-marketplace-lbt8t\" (UID: \"3deafa69-5966-4e1f-980e-425af72acdc0\") " pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.315877 4721 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5gv2r\" (UniqueName: \"kubernetes.io/projected/3deafa69-5966-4e1f-980e-425af72acdc0-kube-api-access-5gv2r\") pod \"redhat-marketplace-lbt8t\" (UID: \"3deafa69-5966-4e1f-980e-425af72acdc0\") " pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:47:56 crc kubenswrapper[4721]: I0202 13:47:56.385803 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:47:57 crc kubenswrapper[4721]: I0202 13:47:56.999919 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbt8t"] Feb 02 13:47:57 crc kubenswrapper[4721]: I0202 13:47:57.070687 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbt8t" event={"ID":"3deafa69-5966-4e1f-980e-425af72acdc0","Type":"ContainerStarted","Data":"6202bf2428b0781b43b75d7377d94c51ff868d6ddd6346141789a203072b4804"} Feb 02 13:47:58 crc kubenswrapper[4721]: I0202 13:47:58.097204 4721 generic.go:334] "Generic (PLEG): container finished" podID="3deafa69-5966-4e1f-980e-425af72acdc0" containerID="2f3db2b82513b61552bc392d16adcdf1a3fb9403b2f840dcd35c94a8884b9556" exitCode=0 Feb 02 13:47:58 crc kubenswrapper[4721]: I0202 13:47:58.097380 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbt8t" event={"ID":"3deafa69-5966-4e1f-980e-425af72acdc0","Type":"ContainerDied","Data":"2f3db2b82513b61552bc392d16adcdf1a3fb9403b2f840dcd35c94a8884b9556"} Feb 02 13:48:00 crc kubenswrapper[4721]: I0202 13:48:00.121959 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbt8t" event={"ID":"3deafa69-5966-4e1f-980e-425af72acdc0","Type":"ContainerStarted","Data":"612953a20e032543a014347153fd200146d412d83ed9394e0b870a25fbc7d9ea"} Feb 02 13:48:01 crc kubenswrapper[4721]: I0202 13:48:01.137056 4721 generic.go:334] "Generic (PLEG): container finished" podID="3deafa69-5966-4e1f-980e-425af72acdc0" containerID="612953a20e032543a014347153fd200146d412d83ed9394e0b870a25fbc7d9ea" exitCode=0 Feb 02 13:48:01 crc kubenswrapper[4721]: I0202 13:48:01.137129 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbt8t" event={"ID":"3deafa69-5966-4e1f-980e-425af72acdc0","Type":"ContainerDied","Data":"612953a20e032543a014347153fd200146d412d83ed9394e0b870a25fbc7d9ea"} Feb 02 13:48:02 crc kubenswrapper[4721]: I0202 13:48:02.173234 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbt8t" event={"ID":"3deafa69-5966-4e1f-980e-425af72acdc0","Type":"ContainerStarted","Data":"10a37277718872dfb34a51982c52a791fb47d31f2ee8f2faf4afd7c82e341051"} Feb 02 13:48:02 crc kubenswrapper[4721]: I0202 13:48:02.199388 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lbt8t" podStartSLOduration=2.510048068 podStartE2EDuration="6.199369518s" podCreationTimestamp="2026-02-02 13:47:56 +0000 UTC" firstStartedPulling="2026-02-02 13:47:58.101108325 +0000 UTC m=+2818.403622714" lastFinishedPulling="2026-02-02 13:48:01.790429775 +0000 UTC m=+2822.092944164" observedRunningTime="2026-02-02 13:48:02.194596499 +0000 UTC m=+2822.497110938" watchObservedRunningTime="2026-02-02 13:48:02.199369518 +0000 UTC m=+2822.501883907" Feb 02 13:48:06 crc kubenswrapper[4721]: I0202 13:48:06.386296 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:48:06 crc kubenswrapper[4721]: I0202 13:48:06.386925 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:48:06 crc kubenswrapper[4721]: I0202 13:48:06.459429 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:48:07 crc kubenswrapper[4721]: I0202 13:48:07.297725 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:48:07 crc kubenswrapper[4721]: I0202 13:48:07.363784 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbt8t"] Feb 02 13:48:09 crc kubenswrapper[4721]: I0202 13:48:09.253848 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lbt8t" podUID="3deafa69-5966-4e1f-980e-425af72acdc0" containerName="registry-server" containerID="cri-o://10a37277718872dfb34a51982c52a791fb47d31f2ee8f2faf4afd7c82e341051" gracePeriod=2 Feb 02 13:48:09 crc kubenswrapper[4721]: I0202 13:48:09.810365 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:48:09 crc kubenswrapper[4721]: I0202 13:48:09.938187 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gv2r\" (UniqueName: \"kubernetes.io/projected/3deafa69-5966-4e1f-980e-425af72acdc0-kube-api-access-5gv2r\") pod \"3deafa69-5966-4e1f-980e-425af72acdc0\" (UID: \"3deafa69-5966-4e1f-980e-425af72acdc0\") " Feb 02 13:48:09 crc kubenswrapper[4721]: I0202 13:48:09.938287 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3deafa69-5966-4e1f-980e-425af72acdc0-catalog-content\") pod \"3deafa69-5966-4e1f-980e-425af72acdc0\" (UID: \"3deafa69-5966-4e1f-980e-425af72acdc0\") " Feb 02 13:48:09 crc kubenswrapper[4721]: I0202 13:48:09.938391 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3deafa69-5966-4e1f-980e-425af72acdc0-utilities\") pod \"3deafa69-5966-4e1f-980e-425af72acdc0\" (UID: \"3deafa69-5966-4e1f-980e-425af72acdc0\") " Feb 02 13:48:09 crc kubenswrapper[4721]: I0202 13:48:09.940541 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3deafa69-5966-4e1f-980e-425af72acdc0-utilities" (OuterVolumeSpecName: "utilities") pod "3deafa69-5966-4e1f-980e-425af72acdc0" (UID: "3deafa69-5966-4e1f-980e-425af72acdc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:48:09 crc kubenswrapper[4721]: I0202 13:48:09.945402 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3deafa69-5966-4e1f-980e-425af72acdc0-kube-api-access-5gv2r" (OuterVolumeSpecName: "kube-api-access-5gv2r") pod "3deafa69-5966-4e1f-980e-425af72acdc0" (UID: "3deafa69-5966-4e1f-980e-425af72acdc0"). InnerVolumeSpecName "kube-api-access-5gv2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:48:09 crc kubenswrapper[4721]: I0202 13:48:09.961426 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3deafa69-5966-4e1f-980e-425af72acdc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3deafa69-5966-4e1f-980e-425af72acdc0" (UID: "3deafa69-5966-4e1f-980e-425af72acdc0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.041559 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3deafa69-5966-4e1f-980e-425af72acdc0-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.041617 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gv2r\" (UniqueName: \"kubernetes.io/projected/3deafa69-5966-4e1f-980e-425af72acdc0-kube-api-access-5gv2r\") on node \"crc\" DevicePath \"\"" Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.041630 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3deafa69-5966-4e1f-980e-425af72acdc0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.267386 4721 generic.go:334] "Generic (PLEG): container finished" podID="3deafa69-5966-4e1f-980e-425af72acdc0" containerID="10a37277718872dfb34a51982c52a791fb47d31f2ee8f2faf4afd7c82e341051" exitCode=0 Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.267435 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbt8t" event={"ID":"3deafa69-5966-4e1f-980e-425af72acdc0","Type":"ContainerDied","Data":"10a37277718872dfb34a51982c52a791fb47d31f2ee8f2faf4afd7c82e341051"} Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.267483 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbt8t" event={"ID":"3deafa69-5966-4e1f-980e-425af72acdc0","Type":"ContainerDied","Data":"6202bf2428b0781b43b75d7377d94c51ff868d6ddd6346141789a203072b4804"} Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.267505 4721 scope.go:117] "RemoveContainer" containerID="10a37277718872dfb34a51982c52a791fb47d31f2ee8f2faf4afd7c82e341051" Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.267523 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbt8t" Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.294466 4721 scope.go:117] "RemoveContainer" containerID="612953a20e032543a014347153fd200146d412d83ed9394e0b870a25fbc7d9ea" Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.316232 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbt8t"] Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.325132 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbt8t"] Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.338843 4721 scope.go:117] "RemoveContainer" containerID="2f3db2b82513b61552bc392d16adcdf1a3fb9403b2f840dcd35c94a8884b9556" Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.397110 4721 scope.go:117] "RemoveContainer" containerID="10a37277718872dfb34a51982c52a791fb47d31f2ee8f2faf4afd7c82e341051" Feb 02 13:48:10 crc kubenswrapper[4721]: E0202 13:48:10.397773 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10a37277718872dfb34a51982c52a791fb47d31f2ee8f2faf4afd7c82e341051\": container with ID starting with 10a37277718872dfb34a51982c52a791fb47d31f2ee8f2faf4afd7c82e341051 not found: ID does not exist" containerID="10a37277718872dfb34a51982c52a791fb47d31f2ee8f2faf4afd7c82e341051" Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.397826 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a37277718872dfb34a51982c52a791fb47d31f2ee8f2faf4afd7c82e341051"} err="failed to get container status \"10a37277718872dfb34a51982c52a791fb47d31f2ee8f2faf4afd7c82e341051\": rpc error: code = NotFound desc = could not find container \"10a37277718872dfb34a51982c52a791fb47d31f2ee8f2faf4afd7c82e341051\": container with ID starting with 10a37277718872dfb34a51982c52a791fb47d31f2ee8f2faf4afd7c82e341051 not found: ID does not exist" Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.397859 4721 scope.go:117] "RemoveContainer" containerID="612953a20e032543a014347153fd200146d412d83ed9394e0b870a25fbc7d9ea" Feb 02 13:48:10 crc kubenswrapper[4721]: E0202 13:48:10.398396 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"612953a20e032543a014347153fd200146d412d83ed9394e0b870a25fbc7d9ea\": container with ID starting with 612953a20e032543a014347153fd200146d412d83ed9394e0b870a25fbc7d9ea not found: ID does not exist" containerID="612953a20e032543a014347153fd200146d412d83ed9394e0b870a25fbc7d9ea" Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.398435 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"612953a20e032543a014347153fd200146d412d83ed9394e0b870a25fbc7d9ea"} err="failed to get container status \"612953a20e032543a014347153fd200146d412d83ed9394e0b870a25fbc7d9ea\": rpc error: code = NotFound desc = could not find container \"612953a20e032543a014347153fd200146d412d83ed9394e0b870a25fbc7d9ea\": container with ID starting with 612953a20e032543a014347153fd200146d412d83ed9394e0b870a25fbc7d9ea not found: ID does not exist" Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.398453 4721 scope.go:117] "RemoveContainer" containerID="2f3db2b82513b61552bc392d16adcdf1a3fb9403b2f840dcd35c94a8884b9556" Feb 02 13:48:10 crc kubenswrapper[4721]: E0202 13:48:10.398790 4721 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2f3db2b82513b61552bc392d16adcdf1a3fb9403b2f840dcd35c94a8884b9556\": container with ID starting with 2f3db2b82513b61552bc392d16adcdf1a3fb9403b2f840dcd35c94a8884b9556 not found: ID does not exist" containerID="2f3db2b82513b61552bc392d16adcdf1a3fb9403b2f840dcd35c94a8884b9556" Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.398891 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f3db2b82513b61552bc392d16adcdf1a3fb9403b2f840dcd35c94a8884b9556"} err="failed to get container status \"2f3db2b82513b61552bc392d16adcdf1a3fb9403b2f840dcd35c94a8884b9556\": rpc error: code = NotFound desc = could not find container \"2f3db2b82513b61552bc392d16adcdf1a3fb9403b2f840dcd35c94a8884b9556\": container with ID starting with 2f3db2b82513b61552bc392d16adcdf1a3fb9403b2f840dcd35c94a8884b9556 not found: ID does not exist" Feb 02 13:48:10 crc kubenswrapper[4721]: I0202 13:48:10.429822 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3deafa69-5966-4e1f-980e-425af72acdc0" path="/var/lib/kubelet/pods/3deafa69-5966-4e1f-980e-425af72acdc0/volumes" Feb 02 13:48:44 crc kubenswrapper[4721]: I0202 13:48:44.763271 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:48:44 crc kubenswrapper[4721]: I0202 13:48:44.763862 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:49:14 crc kubenswrapper[4721]: I0202 13:49:14.763991 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:49:14 crc kubenswrapper[4721]: I0202 13:49:14.764953 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:49:44 crc kubenswrapper[4721]: I0202 13:49:44.763799 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:49:44 crc kubenswrapper[4721]: I0202 13:49:44.764637 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:49:44 crc kubenswrapper[4721]: I0202 13:49:44.764704 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:49:44 crc kubenswrapper[4721]: I0202 13:49:44.766298 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8408071901f7ef7e9399e59808de53d109544919537a0777d33359b2080a1dcb"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 13:49:44 crc kubenswrapper[4721]: I0202 13:49:44.766421 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://8408071901f7ef7e9399e59808de53d109544919537a0777d33359b2080a1dcb" gracePeriod=600 Feb 02 13:49:45 crc kubenswrapper[4721]: I0202 13:49:45.364559 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="8408071901f7ef7e9399e59808de53d109544919537a0777d33359b2080a1dcb" exitCode=0 Feb 02 13:49:45 crc kubenswrapper[4721]: I0202 13:49:45.364645 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"8408071901f7ef7e9399e59808de53d109544919537a0777d33359b2080a1dcb"} Feb 02 13:49:45 crc kubenswrapper[4721]: I0202 13:49:45.365000 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95"} Feb 02 13:49:45 crc kubenswrapper[4721]: I0202 13:49:45.365030 4721 scope.go:117] "RemoveContainer" containerID="d8958dd1d812f7b9a664fedaaa541ab06354a91335826977890ec60b381f7375" Feb 02 13:52:14 crc kubenswrapper[4721]: I0202 13:52:14.765017 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:52:14 crc kubenswrapper[4721]: I0202 13:52:14.765546 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:52:44 crc kubenswrapper[4721]: I0202 13:52:44.763442 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:52:44 crc kubenswrapper[4721]: I0202 13:52:44.764102 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:53:14 crc 
kubenswrapper[4721]: I0202 13:53:14.763214 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:53:14 crc kubenswrapper[4721]: I0202 13:53:14.763709 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:53:14 crc kubenswrapper[4721]: I0202 13:53:14.763752 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 13:53:14 crc kubenswrapper[4721]: I0202 13:53:14.764721 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 13:53:14 crc kubenswrapper[4721]: I0202 13:53:14.764769 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" gracePeriod=600 Feb 02 13:53:14 crc kubenswrapper[4721]: E0202 13:53:14.888185 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:53:15 crc kubenswrapper[4721]: I0202 13:53:15.261273 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" exitCode=0 Feb 02 13:53:15 crc kubenswrapper[4721]: I0202 13:53:15.261344 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95"} Feb 02 13:53:15 crc kubenswrapper[4721]: I0202 13:53:15.261420 4721 scope.go:117] "RemoveContainer" containerID="8408071901f7ef7e9399e59808de53d109544919537a0777d33359b2080a1dcb" Feb 02 13:53:15 crc kubenswrapper[4721]: I0202 13:53:15.262018 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:53:15 crc kubenswrapper[4721]: E0202 13:53:15.262451 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:53:28 crc kubenswrapper[4721]: I0202 13:53:28.409881 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:53:28 crc kubenswrapper[4721]: E0202 13:53:28.410650 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:53:40 crc kubenswrapper[4721]: I0202 13:53:40.419164 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:53:40 crc kubenswrapper[4721]: E0202 13:53:40.420255 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:53:51 crc kubenswrapper[4721]: I0202 13:53:51.409665 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:53:51 crc kubenswrapper[4721]: E0202 13:53:51.410570 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:54:05 crc kubenswrapper[4721]: I0202 13:54:05.427781 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:54:05 crc kubenswrapper[4721]: E0202 13:54:05.428673 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:54:20 crc kubenswrapper[4721]: I0202 13:54:20.423860 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:54:20 crc kubenswrapper[4721]: E0202 13:54:20.424946 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" 
podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:54:32 crc kubenswrapper[4721]: I0202 13:54:32.417320 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:54:32 crc kubenswrapper[4721]: E0202 13:54:32.420240 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:54:46 crc kubenswrapper[4721]: I0202 13:54:46.410595 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:54:46 crc kubenswrapper[4721]: E0202 13:54:46.411866 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:54:57 crc kubenswrapper[4721]: I0202 13:54:57.409997 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:54:57 crc kubenswrapper[4721]: E0202 13:54:57.410977 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:55:12 crc kubenswrapper[4721]: I0202 13:55:12.409558 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:55:12 crc kubenswrapper[4721]: E0202 13:55:12.410438 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:55:23 crc kubenswrapper[4721]: I0202 13:55:23.409856 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:55:23 crc kubenswrapper[4721]: E0202 13:55:23.410724 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:55:34 crc kubenswrapper[4721]: I0202 13:55:34.413466 4721 scope.go:117] "RemoveContainer" 
containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:55:34 crc kubenswrapper[4721]: E0202 13:55:34.415449 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:55:45 crc kubenswrapper[4721]: I0202 13:55:45.409394 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:55:45 crc kubenswrapper[4721]: E0202 13:55:45.409976 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:56:00 crc kubenswrapper[4721]: I0202 13:56:00.420717 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:56:00 crc kubenswrapper[4721]: E0202 13:56:00.421644 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:56:14 crc kubenswrapper[4721]: I0202 13:56:14.411617 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:56:14 crc kubenswrapper[4721]: E0202 13:56:14.413723 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:56:25 crc kubenswrapper[4721]: I0202 13:56:25.410531 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:56:25 crc kubenswrapper[4721]: E0202 13:56:25.412680 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:56:31 crc kubenswrapper[4721]: I0202 13:56:31.825862 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pxrt4"] Feb 02 13:56:31 crc kubenswrapper[4721]: E0202 13:56:31.827191 4721 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3deafa69-5966-4e1f-980e-425af72acdc0" containerName="extract-utilities" Feb 02 13:56:31 crc kubenswrapper[4721]: I0202 13:56:31.827208 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3deafa69-5966-4e1f-980e-425af72acdc0" containerName="extract-utilities" Feb 02 13:56:31 crc kubenswrapper[4721]: E0202 13:56:31.827218 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3deafa69-5966-4e1f-980e-425af72acdc0" containerName="registry-server" Feb 02 13:56:31 crc kubenswrapper[4721]: I0202 13:56:31.827227 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3deafa69-5966-4e1f-980e-425af72acdc0" containerName="registry-server" Feb 02 13:56:31 crc kubenswrapper[4721]: E0202 13:56:31.827257 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3deafa69-5966-4e1f-980e-425af72acdc0" containerName="extract-content" Feb 02 13:56:31 crc kubenswrapper[4721]: I0202 13:56:31.827266 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3deafa69-5966-4e1f-980e-425af72acdc0" containerName="extract-content" Feb 02 13:56:31 crc kubenswrapper[4721]: I0202 13:56:31.827859 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="3deafa69-5966-4e1f-980e-425af72acdc0" containerName="registry-server" Feb 02 13:56:31 crc kubenswrapper[4721]: I0202 13:56:31.830595 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:31 crc kubenswrapper[4721]: I0202 13:56:31.841189 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pxrt4"] Feb 02 13:56:31 crc kubenswrapper[4721]: I0202 13:56:31.944869 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flg89\" (UniqueName: \"kubernetes.io/projected/6d185637-80f0-4145-b618-e2f865c63eae-kube-api-access-flg89\") pod \"community-operators-pxrt4\" (UID: \"6d185637-80f0-4145-b618-e2f865c63eae\") " pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:31 crc kubenswrapper[4721]: I0202 13:56:31.945123 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d185637-80f0-4145-b618-e2f865c63eae-catalog-content\") pod \"community-operators-pxrt4\" (UID: \"6d185637-80f0-4145-b618-e2f865c63eae\") " pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:31 crc kubenswrapper[4721]: I0202 13:56:31.945459 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d185637-80f0-4145-b618-e2f865c63eae-utilities\") pod \"community-operators-pxrt4\" (UID: \"6d185637-80f0-4145-b618-e2f865c63eae\") " pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:32 crc kubenswrapper[4721]: I0202 13:56:32.048764 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flg89\" (UniqueName: \"kubernetes.io/projected/6d185637-80f0-4145-b618-e2f865c63eae-kube-api-access-flg89\") pod \"community-operators-pxrt4\" (UID: \"6d185637-80f0-4145-b618-e2f865c63eae\") " pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:32 crc kubenswrapper[4721]: I0202 13:56:32.048879 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6d185637-80f0-4145-b618-e2f865c63eae-catalog-content\") pod \"community-operators-pxrt4\" (UID: \"6d185637-80f0-4145-b618-e2f865c63eae\") " pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:32 crc kubenswrapper[4721]: I0202 13:56:32.048970 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d185637-80f0-4145-b618-e2f865c63eae-utilities\") pod \"community-operators-pxrt4\" (UID: \"6d185637-80f0-4145-b618-e2f865c63eae\") " pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:32 crc kubenswrapper[4721]: I0202 13:56:32.049412 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d185637-80f0-4145-b618-e2f865c63eae-catalog-content\") pod \"community-operators-pxrt4\" (UID: \"6d185637-80f0-4145-b618-e2f865c63eae\") " pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:32 crc kubenswrapper[4721]: I0202 13:56:32.049515 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d185637-80f0-4145-b618-e2f865c63eae-utilities\") pod \"community-operators-pxrt4\" (UID: \"6d185637-80f0-4145-b618-e2f865c63eae\") " pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:32 crc kubenswrapper[4721]: I0202 13:56:32.070595 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flg89\" (UniqueName: \"kubernetes.io/projected/6d185637-80f0-4145-b618-e2f865c63eae-kube-api-access-flg89\") pod \"community-operators-pxrt4\" (UID: \"6d185637-80f0-4145-b618-e2f865c63eae\") " pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:32 crc kubenswrapper[4721]: I0202 13:56:32.159475 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:32 crc kubenswrapper[4721]: I0202 13:56:32.723447 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pxrt4"] Feb 02 13:56:33 crc kubenswrapper[4721]: I0202 13:56:33.092750 4721 generic.go:334] "Generic (PLEG): container finished" podID="6d185637-80f0-4145-b618-e2f865c63eae" containerID="6c5b85e99338fca40087a9f82e2ea6718a39fe8628f17dc05efb962529fa27f5" exitCode=0 Feb 02 13:56:33 crc kubenswrapper[4721]: I0202 13:56:33.092799 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxrt4" event={"ID":"6d185637-80f0-4145-b618-e2f865c63eae","Type":"ContainerDied","Data":"6c5b85e99338fca40087a9f82e2ea6718a39fe8628f17dc05efb962529fa27f5"} Feb 02 13:56:33 crc kubenswrapper[4721]: I0202 13:56:33.092830 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxrt4" event={"ID":"6d185637-80f0-4145-b618-e2f865c63eae","Type":"ContainerStarted","Data":"d1497bf13ca863f87cd8aabfa72fe1a83cd44e4483f83f21ded2bf8ec4bd89d0"} Feb 02 13:56:33 crc kubenswrapper[4721]: I0202 13:56:33.095190 4721 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 13:56:34 crc kubenswrapper[4721]: I0202 13:56:34.104044 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxrt4" event={"ID":"6d185637-80f0-4145-b618-e2f865c63eae","Type":"ContainerStarted","Data":"a926d42f9c28e7a97b179531d05ec7b7398048e5d75852e89a0a92e2638525d6"} Feb 02 13:56:35 crc kubenswrapper[4721]: I0202 13:56:35.126462 4721 generic.go:334] "Generic (PLEG): container finished" podID="6d185637-80f0-4145-b618-e2f865c63eae" containerID="a926d42f9c28e7a97b179531d05ec7b7398048e5d75852e89a0a92e2638525d6" exitCode=0 Feb 02 13:56:35 crc kubenswrapper[4721]: I0202 13:56:35.126534 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxrt4" event={"ID":"6d185637-80f0-4145-b618-e2f865c63eae","Type":"ContainerDied","Data":"a926d42f9c28e7a97b179531d05ec7b7398048e5d75852e89a0a92e2638525d6"} Feb 02 13:56:38 crc kubenswrapper[4721]: I0202 13:56:38.161435 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxrt4" event={"ID":"6d185637-80f0-4145-b618-e2f865c63eae","Type":"ContainerStarted","Data":"259c1463334a3bb187b54da633918809d1751ad5502acfa3850c7207c533b965"} Feb 02 13:56:38 crc kubenswrapper[4721]: I0202 13:56:38.193697 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pxrt4" podStartSLOduration=2.635322265 podStartE2EDuration="7.193680555s" podCreationTimestamp="2026-02-02 13:56:31 +0000 UTC" firstStartedPulling="2026-02-02 13:56:33.094703135 +0000 UTC m=+3333.397217544" lastFinishedPulling="2026-02-02 13:56:37.653061445 +0000 UTC m=+3337.955575834" observedRunningTime="2026-02-02 13:56:38.187664444 +0000 UTC m=+3338.490178833" watchObservedRunningTime="2026-02-02 13:56:38.193680555 +0000 UTC m=+3338.496194934" Feb 02 13:56:38 crc kubenswrapper[4721]: I0202 13:56:38.410129 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:56:38 crc kubenswrapper[4721]: E0202 13:56:38.410384 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:56:42 crc kubenswrapper[4721]: I0202 13:56:42.160176 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:42 crc kubenswrapper[4721]: I0202 13:56:42.160851 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:42 crc kubenswrapper[4721]: I0202 13:56:42.212984 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:42 crc kubenswrapper[4721]: I0202 13:56:42.268549 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:42 crc kubenswrapper[4721]: I0202 13:56:42.462278 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pxrt4"] Feb 02 13:56:44 crc kubenswrapper[4721]: I0202 13:56:44.229207 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pxrt4" podUID="6d185637-80f0-4145-b618-e2f865c63eae" containerName="registry-server" containerID="cri-o://259c1463334a3bb187b54da633918809d1751ad5502acfa3850c7207c533b965" gracePeriod=2 Feb 02 13:56:44 crc kubenswrapper[4721]: I0202 13:56:44.789365 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:44 crc kubenswrapper[4721]: I0202 13:56:44.878429 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flg89\" (UniqueName: \"kubernetes.io/projected/6d185637-80f0-4145-b618-e2f865c63eae-kube-api-access-flg89\") pod \"6d185637-80f0-4145-b618-e2f865c63eae\" (UID: \"6d185637-80f0-4145-b618-e2f865c63eae\") " Feb 02 13:56:44 crc kubenswrapper[4721]: I0202 13:56:44.878616 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d185637-80f0-4145-b618-e2f865c63eae-catalog-content\") pod \"6d185637-80f0-4145-b618-e2f865c63eae\" (UID: \"6d185637-80f0-4145-b618-e2f865c63eae\") " Feb 02 13:56:44 crc kubenswrapper[4721]: I0202 13:56:44.878716 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d185637-80f0-4145-b618-e2f865c63eae-utilities\") pod \"6d185637-80f0-4145-b618-e2f865c63eae\" (UID: \"6d185637-80f0-4145-b618-e2f865c63eae\") " Feb 02 13:56:44 crc kubenswrapper[4721]: I0202 13:56:44.879674 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d185637-80f0-4145-b618-e2f865c63eae-utilities" (OuterVolumeSpecName: "utilities") pod "6d185637-80f0-4145-b618-e2f865c63eae" (UID: "6d185637-80f0-4145-b618-e2f865c63eae"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:56:44 crc kubenswrapper[4721]: I0202 13:56:44.880311 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d185637-80f0-4145-b618-e2f865c63eae-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:56:44 crc kubenswrapper[4721]: I0202 13:56:44.897361 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d185637-80f0-4145-b618-e2f865c63eae-kube-api-access-flg89" (OuterVolumeSpecName: "kube-api-access-flg89") pod "6d185637-80f0-4145-b618-e2f865c63eae" (UID: "6d185637-80f0-4145-b618-e2f865c63eae"). InnerVolumeSpecName "kube-api-access-flg89". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:56:44 crc kubenswrapper[4721]: I0202 13:56:44.938245 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d185637-80f0-4145-b618-e2f865c63eae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d185637-80f0-4145-b618-e2f865c63eae" (UID: "6d185637-80f0-4145-b618-e2f865c63eae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:56:44 crc kubenswrapper[4721]: I0202 13:56:44.981926 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flg89\" (UniqueName: \"kubernetes.io/projected/6d185637-80f0-4145-b618-e2f865c63eae-kube-api-access-flg89\") on node \"crc\" DevicePath \"\"" Feb 02 13:56:44 crc kubenswrapper[4721]: I0202 13:56:44.981966 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d185637-80f0-4145-b618-e2f865c63eae-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:56:45 crc kubenswrapper[4721]: I0202 13:56:45.242892 4721 generic.go:334] "Generic (PLEG): container finished" podID="6d185637-80f0-4145-b618-e2f865c63eae" containerID="259c1463334a3bb187b54da633918809d1751ad5502acfa3850c7207c533b965" exitCode=0 Feb 02 13:56:45 crc kubenswrapper[4721]: I0202 13:56:45.242941 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pxrt4" Feb 02 13:56:45 crc kubenswrapper[4721]: I0202 13:56:45.242957 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxrt4" event={"ID":"6d185637-80f0-4145-b618-e2f865c63eae","Type":"ContainerDied","Data":"259c1463334a3bb187b54da633918809d1751ad5502acfa3850c7207c533b965"} Feb 02 13:56:45 crc kubenswrapper[4721]: I0202 13:56:45.243026 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pxrt4" event={"ID":"6d185637-80f0-4145-b618-e2f865c63eae","Type":"ContainerDied","Data":"d1497bf13ca863f87cd8aabfa72fe1a83cd44e4483f83f21ded2bf8ec4bd89d0"} Feb 02 13:56:45 crc kubenswrapper[4721]: I0202 13:56:45.243049 4721 scope.go:117] "RemoveContainer" containerID="259c1463334a3bb187b54da633918809d1751ad5502acfa3850c7207c533b965" Feb 02 13:56:45 crc kubenswrapper[4721]: I0202 13:56:45.280645 4721 scope.go:117] "RemoveContainer" containerID="a926d42f9c28e7a97b179531d05ec7b7398048e5d75852e89a0a92e2638525d6" Feb 02 13:56:45 crc kubenswrapper[4721]: I0202 13:56:45.295037 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pxrt4"] Feb 02 13:56:45 crc kubenswrapper[4721]: I0202 13:56:45.308629 4721 scope.go:117] "RemoveContainer" containerID="6c5b85e99338fca40087a9f82e2ea6718a39fe8628f17dc05efb962529fa27f5" Feb 02 13:56:45 crc kubenswrapper[4721]: I0202 13:56:45.312633 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pxrt4"] Feb 02 13:56:45 crc kubenswrapper[4721]: I0202 13:56:45.367597 4721 scope.go:117] "RemoveContainer" containerID="259c1463334a3bb187b54da633918809d1751ad5502acfa3850c7207c533b965" Feb 02 13:56:45 crc kubenswrapper[4721]: E0202 13:56:45.368724 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"259c1463334a3bb187b54da633918809d1751ad5502acfa3850c7207c533b965\": container with ID starting with 259c1463334a3bb187b54da633918809d1751ad5502acfa3850c7207c533b965 not found: ID does not exist" containerID="259c1463334a3bb187b54da633918809d1751ad5502acfa3850c7207c533b965" Feb 02 13:56:45 crc kubenswrapper[4721]: I0202 13:56:45.368763 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"259c1463334a3bb187b54da633918809d1751ad5502acfa3850c7207c533b965"} err="failed to get container status \"259c1463334a3bb187b54da633918809d1751ad5502acfa3850c7207c533b965\": rpc error: code = NotFound desc = could not find container \"259c1463334a3bb187b54da633918809d1751ad5502acfa3850c7207c533b965\": container with ID starting with 259c1463334a3bb187b54da633918809d1751ad5502acfa3850c7207c533b965 not found: ID does not exist" Feb 02 13:56:45 crc kubenswrapper[4721]: I0202 13:56:45.368789 4721 scope.go:117] "RemoveContainer" containerID="a926d42f9c28e7a97b179531d05ec7b7398048e5d75852e89a0a92e2638525d6" Feb 02 13:56:45 crc kubenswrapper[4721]: E0202 13:56:45.369130 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a926d42f9c28e7a97b179531d05ec7b7398048e5d75852e89a0a92e2638525d6\": container with ID starting with a926d42f9c28e7a97b179531d05ec7b7398048e5d75852e89a0a92e2638525d6 not found: ID does not exist" containerID="a926d42f9c28e7a97b179531d05ec7b7398048e5d75852e89a0a92e2638525d6" Feb 02 13:56:45 crc kubenswrapper[4721]: I0202 13:56:45.369174 4721 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a926d42f9c28e7a97b179531d05ec7b7398048e5d75852e89a0a92e2638525d6"} err="failed to get container status \"a926d42f9c28e7a97b179531d05ec7b7398048e5d75852e89a0a92e2638525d6\": rpc error: code = NotFound desc = could not find container \"a926d42f9c28e7a97b179531d05ec7b7398048e5d75852e89a0a92e2638525d6\": container with ID starting with a926d42f9c28e7a97b179531d05ec7b7398048e5d75852e89a0a92e2638525d6 not found: ID does not exist" Feb 02 13:56:45 crc kubenswrapper[4721]: I0202 13:56:45.369192 4721 scope.go:117] "RemoveContainer" containerID="6c5b85e99338fca40087a9f82e2ea6718a39fe8628f17dc05efb962529fa27f5" Feb 02 13:56:45 crc kubenswrapper[4721]: E0202 13:56:45.369604 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c5b85e99338fca40087a9f82e2ea6718a39fe8628f17dc05efb962529fa27f5\": container with ID starting with 6c5b85e99338fca40087a9f82e2ea6718a39fe8628f17dc05efb962529fa27f5 not found: ID does not exist" containerID="6c5b85e99338fca40087a9f82e2ea6718a39fe8628f17dc05efb962529fa27f5" Feb 02 13:56:45 crc kubenswrapper[4721]: I0202 13:56:45.369700 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c5b85e99338fca40087a9f82e2ea6718a39fe8628f17dc05efb962529fa27f5"} err="failed to get container status \"6c5b85e99338fca40087a9f82e2ea6718a39fe8628f17dc05efb962529fa27f5\": rpc error: code = NotFound desc = could not find container \"6c5b85e99338fca40087a9f82e2ea6718a39fe8628f17dc05efb962529fa27f5\": container with ID starting with 6c5b85e99338fca40087a9f82e2ea6718a39fe8628f17dc05efb962529fa27f5 not found: ID does not exist" Feb 02 13:56:46 crc kubenswrapper[4721]: I0202 13:56:46.423805 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d185637-80f0-4145-b618-e2f865c63eae" path="/var/lib/kubelet/pods/6d185637-80f0-4145-b618-e2f865c63eae/volumes" Feb 02 13:56:50 crc kubenswrapper[4721]: I0202 13:56:50.416473 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:56:50 crc kubenswrapper[4721]: E0202 13:56:50.417314 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:57:02 crc kubenswrapper[4721]: I0202 13:57:02.426446 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:57:02 crc kubenswrapper[4721]: E0202 13:57:02.427304 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:57:13 crc kubenswrapper[4721]: I0202 13:57:13.409788 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:57:13 crc 
kubenswrapper[4721]: E0202 13:57:13.410506 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.365463 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l2rfj"] Feb 02 13:57:24 crc kubenswrapper[4721]: E0202 13:57:24.366604 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d185637-80f0-4145-b618-e2f865c63eae" containerName="registry-server" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.366623 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d185637-80f0-4145-b618-e2f865c63eae" containerName="registry-server" Feb 02 13:57:24 crc kubenswrapper[4721]: E0202 13:57:24.366636 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d185637-80f0-4145-b618-e2f865c63eae" containerName="extract-utilities" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.366645 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d185637-80f0-4145-b618-e2f865c63eae" containerName="extract-utilities" Feb 02 13:57:24 crc kubenswrapper[4721]: E0202 13:57:24.366685 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d185637-80f0-4145-b618-e2f865c63eae" containerName="extract-content" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.366693 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d185637-80f0-4145-b618-e2f865c63eae" containerName="extract-content" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.366999 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d185637-80f0-4145-b618-e2f865c63eae" containerName="registry-server" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.369249 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.376735 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l2rfj"] Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.424928 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5hzs\" (UniqueName: \"kubernetes.io/projected/ce3793f3-fe06-4956-84c0-3811f6449960-kube-api-access-g5hzs\") pod \"certified-operators-l2rfj\" (UID: \"ce3793f3-fe06-4956-84c0-3811f6449960\") " pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.425189 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3793f3-fe06-4956-84c0-3811f6449960-catalog-content\") pod \"certified-operators-l2rfj\" (UID: \"ce3793f3-fe06-4956-84c0-3811f6449960\") " pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.425690 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3793f3-fe06-4956-84c0-3811f6449960-utilities\") pod \"certified-operators-l2rfj\" (UID: \"ce3793f3-fe06-4956-84c0-3811f6449960\") " pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.527861 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3793f3-fe06-4956-84c0-3811f6449960-catalog-content\") pod \"certified-operators-l2rfj\" (UID: \"ce3793f3-fe06-4956-84c0-3811f6449960\") " pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.528052 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3793f3-fe06-4956-84c0-3811f6449960-utilities\") pod \"certified-operators-l2rfj\" (UID: \"ce3793f3-fe06-4956-84c0-3811f6449960\") " pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.528257 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5hzs\" (UniqueName: \"kubernetes.io/projected/ce3793f3-fe06-4956-84c0-3811f6449960-kube-api-access-g5hzs\") pod \"certified-operators-l2rfj\" (UID: \"ce3793f3-fe06-4956-84c0-3811f6449960\") " pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.528381 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3793f3-fe06-4956-84c0-3811f6449960-catalog-content\") pod \"certified-operators-l2rfj\" (UID: \"ce3793f3-fe06-4956-84c0-3811f6449960\") " pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.528541 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3793f3-fe06-4956-84c0-3811f6449960-utilities\") pod \"certified-operators-l2rfj\" (UID: \"ce3793f3-fe06-4956-84c0-3811f6449960\") " pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.546729 4721 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g5hzs\" (UniqueName: \"kubernetes.io/projected/ce3793f3-fe06-4956-84c0-3811f6449960-kube-api-access-g5hzs\") pod \"certified-operators-l2rfj\" (UID: \"ce3793f3-fe06-4956-84c0-3811f6449960\") " pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:24 crc kubenswrapper[4721]: I0202 13:57:24.701112 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:25 crc kubenswrapper[4721]: I0202 13:57:25.302396 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l2rfj"] Feb 02 13:57:25 crc kubenswrapper[4721]: I0202 13:57:25.652947 4721 generic.go:334] "Generic (PLEG): container finished" podID="ce3793f3-fe06-4956-84c0-3811f6449960" containerID="416ec678adbc5a8a9292d17b0a54b193e2d634d29af6fe00334987f93e56cf85" exitCode=0 Feb 02 13:57:25 crc kubenswrapper[4721]: I0202 13:57:25.653052 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2rfj" event={"ID":"ce3793f3-fe06-4956-84c0-3811f6449960","Type":"ContainerDied","Data":"416ec678adbc5a8a9292d17b0a54b193e2d634d29af6fe00334987f93e56cf85"} Feb 02 13:57:25 crc kubenswrapper[4721]: I0202 13:57:25.653293 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2rfj" event={"ID":"ce3793f3-fe06-4956-84c0-3811f6449960","Type":"ContainerStarted","Data":"814efe839a1fecc5914e839d354f8a1443d99c6e663998cefc39d21b28109536"} Feb 02 13:57:27 crc kubenswrapper[4721]: I0202 13:57:27.410414 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:57:27 crc kubenswrapper[4721]: E0202 13:57:27.410706 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:57:27 crc kubenswrapper[4721]: I0202 13:57:27.961342 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q6jkv"] Feb 02 13:57:27 crc kubenswrapper[4721]: I0202 13:57:27.964558 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q6jkv" Feb 02 13:57:27 crc kubenswrapper[4721]: I0202 13:57:27.979241 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q6jkv"] Feb 02 13:57:28 crc kubenswrapper[4721]: I0202 13:57:28.052109 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7625a6ea-aff2-4a16-a62a-fec198126d2f-utilities\") pod \"redhat-operators-q6jkv\" (UID: \"7625a6ea-aff2-4a16-a62a-fec198126d2f\") " pod="openshift-marketplace/redhat-operators-q6jkv" Feb 02 13:57:28 crc kubenswrapper[4721]: I0202 13:57:28.052277 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk845\" (UniqueName: \"kubernetes.io/projected/7625a6ea-aff2-4a16-a62a-fec198126d2f-kube-api-access-kk845\") pod \"redhat-operators-q6jkv\" (UID: \"7625a6ea-aff2-4a16-a62a-fec198126d2f\") " pod="openshift-marketplace/redhat-operators-q6jkv" Feb 02 13:57:28 crc kubenswrapper[4721]: I0202 13:57:28.052448 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7625a6ea-aff2-4a16-a62a-fec198126d2f-catalog-content\") pod \"redhat-operators-q6jkv\" (UID: \"7625a6ea-aff2-4a16-a62a-fec198126d2f\") " pod="openshift-marketplace/redhat-operators-q6jkv" Feb 02 13:57:28 crc kubenswrapper[4721]: I0202 13:57:28.156527 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk845\" (UniqueName: \"kubernetes.io/projected/7625a6ea-aff2-4a16-a62a-fec198126d2f-kube-api-access-kk845\") pod \"redhat-operators-q6jkv\" (UID: \"7625a6ea-aff2-4a16-a62a-fec198126d2f\") " pod="openshift-marketplace/redhat-operators-q6jkv" Feb 02 13:57:28 crc kubenswrapper[4721]: I0202 13:57:28.157370 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7625a6ea-aff2-4a16-a62a-fec198126d2f-catalog-content\") pod \"redhat-operators-q6jkv\" (UID: \"7625a6ea-aff2-4a16-a62a-fec198126d2f\") " pod="openshift-marketplace/redhat-operators-q6jkv" Feb 02 13:57:28 crc kubenswrapper[4721]: I0202 13:57:28.157668 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7625a6ea-aff2-4a16-a62a-fec198126d2f-utilities\") pod \"redhat-operators-q6jkv\" (UID: \"7625a6ea-aff2-4a16-a62a-fec198126d2f\") " pod="openshift-marketplace/redhat-operators-q6jkv" Feb 02 13:57:28 crc kubenswrapper[4721]: I0202 13:57:28.157842 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7625a6ea-aff2-4a16-a62a-fec198126d2f-catalog-content\") pod \"redhat-operators-q6jkv\" (UID: \"7625a6ea-aff2-4a16-a62a-fec198126d2f\") " pod="openshift-marketplace/redhat-operators-q6jkv" Feb 02 13:57:28 crc kubenswrapper[4721]: I0202 13:57:28.158198 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7625a6ea-aff2-4a16-a62a-fec198126d2f-utilities\") pod \"redhat-operators-q6jkv\" (UID: \"7625a6ea-aff2-4a16-a62a-fec198126d2f\") " pod="openshift-marketplace/redhat-operators-q6jkv" Feb 02 13:57:28 crc kubenswrapper[4721]: I0202 13:57:28.175741 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kk845\" (UniqueName: \"kubernetes.io/projected/7625a6ea-aff2-4a16-a62a-fec198126d2f-kube-api-access-kk845\") pod \"redhat-operators-q6jkv\" (UID: \"7625a6ea-aff2-4a16-a62a-fec198126d2f\") " pod="openshift-marketplace/redhat-operators-q6jkv" Feb 02 13:57:28 crc kubenswrapper[4721]: I0202 13:57:28.286816 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q6jkv" Feb 02 13:57:28 crc kubenswrapper[4721]: I0202 13:57:28.681265 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2rfj" event={"ID":"ce3793f3-fe06-4956-84c0-3811f6449960","Type":"ContainerStarted","Data":"20d8b245a9d340220dba41ff04541a95c6f99a8d0ac94cc7eec398ec30cf7114"} Feb 02 13:57:28 crc kubenswrapper[4721]: W0202 13:57:28.788810 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7625a6ea_aff2_4a16_a62a_fec198126d2f.slice/crio-444e09281b872c51787242c9be7b8fdce41a8b5549c444fde0bd545c400d7ef0 WatchSource:0}: Error finding container 444e09281b872c51787242c9be7b8fdce41a8b5549c444fde0bd545c400d7ef0: Status 404 returned error can't find the container with id 444e09281b872c51787242c9be7b8fdce41a8b5549c444fde0bd545c400d7ef0 Feb 02 13:57:28 crc kubenswrapper[4721]: I0202 13:57:28.793364 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q6jkv"] Feb 02 13:57:29 crc kubenswrapper[4721]: I0202 13:57:29.691855 4721 generic.go:334] "Generic (PLEG): container finished" podID="7625a6ea-aff2-4a16-a62a-fec198126d2f" containerID="c01d56064c94befb707a3cfc3128fb239069c1ed7f645a9ef02cd7d25a3e6539" exitCode=0 Feb 02 13:57:29 crc kubenswrapper[4721]: I0202 13:57:29.691993 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6jkv" event={"ID":"7625a6ea-aff2-4a16-a62a-fec198126d2f","Type":"ContainerDied","Data":"c01d56064c94befb707a3cfc3128fb239069c1ed7f645a9ef02cd7d25a3e6539"} Feb 02 13:57:29 crc kubenswrapper[4721]: I0202 13:57:29.692483 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6jkv" event={"ID":"7625a6ea-aff2-4a16-a62a-fec198126d2f","Type":"ContainerStarted","Data":"444e09281b872c51787242c9be7b8fdce41a8b5549c444fde0bd545c400d7ef0"} Feb 02 13:57:32 crc kubenswrapper[4721]: I0202 13:57:32.725377 4721 generic.go:334] "Generic (PLEG): container finished" podID="ce3793f3-fe06-4956-84c0-3811f6449960" containerID="20d8b245a9d340220dba41ff04541a95c6f99a8d0ac94cc7eec398ec30cf7114" exitCode=0 Feb 02 13:57:32 crc kubenswrapper[4721]: I0202 13:57:32.726238 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2rfj" event={"ID":"ce3793f3-fe06-4956-84c0-3811f6449960","Type":"ContainerDied","Data":"20d8b245a9d340220dba41ff04541a95c6f99a8d0ac94cc7eec398ec30cf7114"} Feb 02 13:57:40 crc kubenswrapper[4721]: I0202 13:57:40.417572 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:57:40 crc kubenswrapper[4721]: E0202 13:57:40.418323 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:57:44 crc kubenswrapper[4721]: E0202 13:57:44.287471 4721 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 02 13:57:44 crc kubenswrapper[4721]: E0202 13:57:44.288505 4721 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kk845,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-q6jkv_openshift-marketplace(7625a6ea-aff2-4a16-a62a-fec198126d2f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 02 13:57:44 crc kubenswrapper[4721]: E0202 13:57:44.290137 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-q6jkv" podUID="7625a6ea-aff2-4a16-a62a-fec198126d2f" Feb 02 13:57:44 crc kubenswrapper[4721]: I0202 13:57:44.856601 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2rfj" event={"ID":"ce3793f3-fe06-4956-84c0-3811f6449960","Type":"ContainerStarted","Data":"eda9a3095650a4417a1d00a1fd3927affd2a6123e2a22b48b305282f96bf93a9"} Feb 02 13:57:44 crc kubenswrapper[4721]: E0202 13:57:44.859936 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-q6jkv" podUID="7625a6ea-aff2-4a16-a62a-fec198126d2f" Feb 02 13:57:44 crc kubenswrapper[4721]: I0202 13:57:44.905793 4721 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/certified-operators-l2rfj" podStartSLOduration=3.310618251 podStartE2EDuration="20.905773249s" podCreationTimestamp="2026-02-02 13:57:24 +0000 UTC" firstStartedPulling="2026-02-02 13:57:26.664098939 +0000 UTC m=+3386.966613328" lastFinishedPulling="2026-02-02 13:57:44.259253937 +0000 UTC m=+3404.561768326" observedRunningTime="2026-02-02 13:57:44.899732936 +0000 UTC m=+3405.202247345" watchObservedRunningTime="2026-02-02 13:57:44.905773249 +0000 UTC m=+3405.208287638" Feb 02 13:57:54 crc kubenswrapper[4721]: I0202 13:57:54.409909 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:57:54 crc kubenswrapper[4721]: E0202 13:57:54.410791 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:57:54 crc kubenswrapper[4721]: I0202 13:57:54.701757 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:54 crc kubenswrapper[4721]: I0202 13:57:54.701869 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:54 crc kubenswrapper[4721]: I0202 13:57:54.755042 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:55 crc kubenswrapper[4721]: I0202 13:57:55.012585 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:55 crc kubenswrapper[4721]: I0202 13:57:55.571225 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l2rfj"] Feb 02 13:57:56 crc kubenswrapper[4721]: I0202 13:57:56.982591 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l2rfj" podUID="ce3793f3-fe06-4956-84c0-3811f6449960" containerName="registry-server" containerID="cri-o://eda9a3095650a4417a1d00a1fd3927affd2a6123e2a22b48b305282f96bf93a9" gracePeriod=2 Feb 02 13:57:57 crc kubenswrapper[4721]: I0202 13:57:57.542771 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:57 crc kubenswrapper[4721]: I0202 13:57:57.691666 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3793f3-fe06-4956-84c0-3811f6449960-catalog-content\") pod \"ce3793f3-fe06-4956-84c0-3811f6449960\" (UID: \"ce3793f3-fe06-4956-84c0-3811f6449960\") " Feb 02 13:57:57 crc kubenswrapper[4721]: I0202 13:57:57.691727 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3793f3-fe06-4956-84c0-3811f6449960-utilities\") pod \"ce3793f3-fe06-4956-84c0-3811f6449960\" (UID: \"ce3793f3-fe06-4956-84c0-3811f6449960\") " Feb 02 13:57:57 crc kubenswrapper[4721]: I0202 13:57:57.691840 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5hzs\" (UniqueName: \"kubernetes.io/projected/ce3793f3-fe06-4956-84c0-3811f6449960-kube-api-access-g5hzs\") pod \"ce3793f3-fe06-4956-84c0-3811f6449960\" (UID: \"ce3793f3-fe06-4956-84c0-3811f6449960\") " Feb 02 13:57:57 crc kubenswrapper[4721]: I0202 13:57:57.692548 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce3793f3-fe06-4956-84c0-3811f6449960-utilities" (OuterVolumeSpecName: "utilities") pod "ce3793f3-fe06-4956-84c0-3811f6449960" (UID: "ce3793f3-fe06-4956-84c0-3811f6449960"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:57:57 crc kubenswrapper[4721]: I0202 13:57:57.693030 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3793f3-fe06-4956-84c0-3811f6449960-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:57:57 crc kubenswrapper[4721]: I0202 13:57:57.700452 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce3793f3-fe06-4956-84c0-3811f6449960-kube-api-access-g5hzs" (OuterVolumeSpecName: "kube-api-access-g5hzs") pod "ce3793f3-fe06-4956-84c0-3811f6449960" (UID: "ce3793f3-fe06-4956-84c0-3811f6449960"). InnerVolumeSpecName "kube-api-access-g5hzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:57:57 crc kubenswrapper[4721]: I0202 13:57:57.745782 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce3793f3-fe06-4956-84c0-3811f6449960-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce3793f3-fe06-4956-84c0-3811f6449960" (UID: "ce3793f3-fe06-4956-84c0-3811f6449960"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:57:57 crc kubenswrapper[4721]: I0202 13:57:57.795565 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3793f3-fe06-4956-84c0-3811f6449960-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:57:57 crc kubenswrapper[4721]: I0202 13:57:57.795619 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5hzs\" (UniqueName: \"kubernetes.io/projected/ce3793f3-fe06-4956-84c0-3811f6449960-kube-api-access-g5hzs\") on node \"crc\" DevicePath \"\"" Feb 02 13:57:57 crc kubenswrapper[4721]: I0202 13:57:57.995750 4721 generic.go:334] "Generic (PLEG): container finished" podID="ce3793f3-fe06-4956-84c0-3811f6449960" containerID="eda9a3095650a4417a1d00a1fd3927affd2a6123e2a22b48b305282f96bf93a9" exitCode=0 Feb 02 13:57:57 crc kubenswrapper[4721]: I0202 13:57:57.995807 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2rfj" event={"ID":"ce3793f3-fe06-4956-84c0-3811f6449960","Type":"ContainerDied","Data":"eda9a3095650a4417a1d00a1fd3927affd2a6123e2a22b48b305282f96bf93a9"} Feb 02 13:57:57 crc kubenswrapper[4721]: I0202 13:57:57.995837 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l2rfj" Feb 02 13:57:57 crc kubenswrapper[4721]: I0202 13:57:57.995859 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l2rfj" event={"ID":"ce3793f3-fe06-4956-84c0-3811f6449960","Type":"ContainerDied","Data":"814efe839a1fecc5914e839d354f8a1443d99c6e663998cefc39d21b28109536"} Feb 02 13:57:57 crc kubenswrapper[4721]: I0202 13:57:57.995879 4721 scope.go:117] "RemoveContainer" containerID="eda9a3095650a4417a1d00a1fd3927affd2a6123e2a22b48b305282f96bf93a9" Feb 02 13:57:58 crc kubenswrapper[4721]: I0202 13:57:58.017135 4721 scope.go:117] "RemoveContainer" containerID="20d8b245a9d340220dba41ff04541a95c6f99a8d0ac94cc7eec398ec30cf7114" Feb 02 13:57:58 crc kubenswrapper[4721]: I0202 13:57:58.043315 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l2rfj"] Feb 02 13:57:58 crc kubenswrapper[4721]: I0202 13:57:58.048972 4721 scope.go:117] "RemoveContainer" containerID="416ec678adbc5a8a9292d17b0a54b193e2d634d29af6fe00334987f93e56cf85" Feb 02 13:57:58 crc kubenswrapper[4721]: I0202 13:57:58.055600 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l2rfj"] Feb 02 13:57:58 crc kubenswrapper[4721]: I0202 13:57:58.109951 4721 scope.go:117] "RemoveContainer" containerID="eda9a3095650a4417a1d00a1fd3927affd2a6123e2a22b48b305282f96bf93a9" Feb 02 13:57:58 crc kubenswrapper[4721]: E0202 13:57:58.110493 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda9a3095650a4417a1d00a1fd3927affd2a6123e2a22b48b305282f96bf93a9\": container with ID starting with eda9a3095650a4417a1d00a1fd3927affd2a6123e2a22b48b305282f96bf93a9 not found: ID does not exist" containerID="eda9a3095650a4417a1d00a1fd3927affd2a6123e2a22b48b305282f96bf93a9" Feb 02 13:57:58 crc kubenswrapper[4721]: I0202 13:57:58.110526 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda9a3095650a4417a1d00a1fd3927affd2a6123e2a22b48b305282f96bf93a9"} err="failed to get container status 
\"eda9a3095650a4417a1d00a1fd3927affd2a6123e2a22b48b305282f96bf93a9\": rpc error: code = NotFound desc = could not find container \"eda9a3095650a4417a1d00a1fd3927affd2a6123e2a22b48b305282f96bf93a9\": container with ID starting with eda9a3095650a4417a1d00a1fd3927affd2a6123e2a22b48b305282f96bf93a9 not found: ID does not exist" Feb 02 13:57:58 crc kubenswrapper[4721]: I0202 13:57:58.110545 4721 scope.go:117] "RemoveContainer" containerID="20d8b245a9d340220dba41ff04541a95c6f99a8d0ac94cc7eec398ec30cf7114" Feb 02 13:57:58 crc kubenswrapper[4721]: E0202 13:57:58.111226 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20d8b245a9d340220dba41ff04541a95c6f99a8d0ac94cc7eec398ec30cf7114\": container with ID starting with 20d8b245a9d340220dba41ff04541a95c6f99a8d0ac94cc7eec398ec30cf7114 not found: ID does not exist" containerID="20d8b245a9d340220dba41ff04541a95c6f99a8d0ac94cc7eec398ec30cf7114" Feb 02 13:57:58 crc kubenswrapper[4721]: I0202 13:57:58.111270 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20d8b245a9d340220dba41ff04541a95c6f99a8d0ac94cc7eec398ec30cf7114"} err="failed to get container status \"20d8b245a9d340220dba41ff04541a95c6f99a8d0ac94cc7eec398ec30cf7114\": rpc error: code = NotFound desc = could not find container \"20d8b245a9d340220dba41ff04541a95c6f99a8d0ac94cc7eec398ec30cf7114\": container with ID starting with 20d8b245a9d340220dba41ff04541a95c6f99a8d0ac94cc7eec398ec30cf7114 not found: ID does not exist" Feb 02 13:57:58 crc kubenswrapper[4721]: I0202 13:57:58.111324 4721 scope.go:117] "RemoveContainer" containerID="416ec678adbc5a8a9292d17b0a54b193e2d634d29af6fe00334987f93e56cf85" Feb 02 13:57:58 crc kubenswrapper[4721]: E0202 13:57:58.111618 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416ec678adbc5a8a9292d17b0a54b193e2d634d29af6fe00334987f93e56cf85\": container with ID starting with 416ec678adbc5a8a9292d17b0a54b193e2d634d29af6fe00334987f93e56cf85 not found: ID does not exist" containerID="416ec678adbc5a8a9292d17b0a54b193e2d634d29af6fe00334987f93e56cf85" Feb 02 13:57:58 crc kubenswrapper[4721]: I0202 13:57:58.111654 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416ec678adbc5a8a9292d17b0a54b193e2d634d29af6fe00334987f93e56cf85"} err="failed to get container status \"416ec678adbc5a8a9292d17b0a54b193e2d634d29af6fe00334987f93e56cf85\": rpc error: code = NotFound desc = could not find container \"416ec678adbc5a8a9292d17b0a54b193e2d634d29af6fe00334987f93e56cf85\": container with ID starting with 416ec678adbc5a8a9292d17b0a54b193e2d634d29af6fe00334987f93e56cf85 not found: ID does not exist" Feb 02 13:57:58 crc kubenswrapper[4721]: I0202 13:57:58.423698 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce3793f3-fe06-4956-84c0-3811f6449960" path="/var/lib/kubelet/pods/ce3793f3-fe06-4956-84c0-3811f6449960/volumes" Feb 02 13:58:00 crc kubenswrapper[4721]: I0202 13:58:00.019103 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6jkv" event={"ID":"7625a6ea-aff2-4a16-a62a-fec198126d2f","Type":"ContainerStarted","Data":"09a1c914b9db7ff8c4ed2aab89cc37a8e547ffbab3c86e51891ec9356164aa85"} Feb 02 13:58:01 crc kubenswrapper[4721]: I0202 13:58:01.030336 4721 generic.go:334] "Generic (PLEG): container finished" podID="7625a6ea-aff2-4a16-a62a-fec198126d2f" 
containerID="09a1c914b9db7ff8c4ed2aab89cc37a8e547ffbab3c86e51891ec9356164aa85" exitCode=0 Feb 02 13:58:01 crc kubenswrapper[4721]: I0202 13:58:01.030442 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6jkv" event={"ID":"7625a6ea-aff2-4a16-a62a-fec198126d2f","Type":"ContainerDied","Data":"09a1c914b9db7ff8c4ed2aab89cc37a8e547ffbab3c86e51891ec9356164aa85"} Feb 02 13:58:03 crc kubenswrapper[4721]: I0202 13:58:03.049895 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6jkv" event={"ID":"7625a6ea-aff2-4a16-a62a-fec198126d2f","Type":"ContainerStarted","Data":"558cb9db418799365f6b95799667f971c21531f838a4c6731ce4bfc1255d7a15"} Feb 02 13:58:03 crc kubenswrapper[4721]: I0202 13:58:03.070315 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q6jkv" podStartSLOduration=3.984900403 podStartE2EDuration="36.07029726s" podCreationTimestamp="2026-02-02 13:57:27 +0000 UTC" firstStartedPulling="2026-02-02 13:57:29.694087672 +0000 UTC m=+3389.996602061" lastFinishedPulling="2026-02-02 13:58:01.779484529 +0000 UTC m=+3422.081998918" observedRunningTime="2026-02-02 13:58:03.069478598 +0000 UTC m=+3423.371993017" watchObservedRunningTime="2026-02-02 13:58:03.07029726 +0000 UTC m=+3423.372811659" Feb 02 13:58:03 crc kubenswrapper[4721]: I0202 13:58:03.786683 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gzhg4"] Feb 02 13:58:03 crc kubenswrapper[4721]: E0202 13:58:03.787305 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3793f3-fe06-4956-84c0-3811f6449960" containerName="extract-content" Feb 02 13:58:03 crc kubenswrapper[4721]: I0202 13:58:03.787331 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3793f3-fe06-4956-84c0-3811f6449960" containerName="extract-content" Feb 02 13:58:03 crc kubenswrapper[4721]: E0202 13:58:03.787364 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3793f3-fe06-4956-84c0-3811f6449960" containerName="registry-server" Feb 02 13:58:03 crc kubenswrapper[4721]: I0202 13:58:03.787373 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3793f3-fe06-4956-84c0-3811f6449960" containerName="registry-server" Feb 02 13:58:03 crc kubenswrapper[4721]: E0202 13:58:03.787384 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3793f3-fe06-4956-84c0-3811f6449960" containerName="extract-utilities" Feb 02 13:58:03 crc kubenswrapper[4721]: I0202 13:58:03.787396 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3793f3-fe06-4956-84c0-3811f6449960" containerName="extract-utilities" Feb 02 13:58:03 crc kubenswrapper[4721]: I0202 13:58:03.787737 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce3793f3-fe06-4956-84c0-3811f6449960" containerName="registry-server" Feb 02 13:58:03 crc kubenswrapper[4721]: I0202 13:58:03.790217 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:03 crc kubenswrapper[4721]: I0202 13:58:03.806091 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzhg4"] Feb 02 13:58:03 crc kubenswrapper[4721]: I0202 13:58:03.947794 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb4ss\" (UniqueName: \"kubernetes.io/projected/4d9d6046-cd2e-407a-8264-66d3a10338a5-kube-api-access-fb4ss\") pod \"redhat-marketplace-gzhg4\" (UID: \"4d9d6046-cd2e-407a-8264-66d3a10338a5\") " pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:03 crc kubenswrapper[4721]: I0202 13:58:03.947889 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d9d6046-cd2e-407a-8264-66d3a10338a5-utilities\") pod \"redhat-marketplace-gzhg4\" (UID: \"4d9d6046-cd2e-407a-8264-66d3a10338a5\") " pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:03 crc kubenswrapper[4721]: I0202 13:58:03.947954 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d9d6046-cd2e-407a-8264-66d3a10338a5-catalog-content\") pod \"redhat-marketplace-gzhg4\" (UID: \"4d9d6046-cd2e-407a-8264-66d3a10338a5\") " pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:04 crc kubenswrapper[4721]: I0202 13:58:04.050774 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb4ss\" (UniqueName: \"kubernetes.io/projected/4d9d6046-cd2e-407a-8264-66d3a10338a5-kube-api-access-fb4ss\") pod \"redhat-marketplace-gzhg4\" (UID: \"4d9d6046-cd2e-407a-8264-66d3a10338a5\") " pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:04 crc kubenswrapper[4721]: I0202 13:58:04.050858 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d9d6046-cd2e-407a-8264-66d3a10338a5-utilities\") pod \"redhat-marketplace-gzhg4\" (UID: \"4d9d6046-cd2e-407a-8264-66d3a10338a5\") " pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:04 crc kubenswrapper[4721]: I0202 13:58:04.050907 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d9d6046-cd2e-407a-8264-66d3a10338a5-catalog-content\") pod \"redhat-marketplace-gzhg4\" (UID: \"4d9d6046-cd2e-407a-8264-66d3a10338a5\") " pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:04 crc kubenswrapper[4721]: I0202 13:58:04.051605 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d9d6046-cd2e-407a-8264-66d3a10338a5-catalog-content\") pod \"redhat-marketplace-gzhg4\" (UID: \"4d9d6046-cd2e-407a-8264-66d3a10338a5\") " pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:04 crc kubenswrapper[4721]: I0202 13:58:04.052245 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d9d6046-cd2e-407a-8264-66d3a10338a5-utilities\") pod \"redhat-marketplace-gzhg4\" (UID: \"4d9d6046-cd2e-407a-8264-66d3a10338a5\") " pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:04 crc kubenswrapper[4721]: I0202 13:58:04.073133 4721 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-fb4ss\" (UniqueName: \"kubernetes.io/projected/4d9d6046-cd2e-407a-8264-66d3a10338a5-kube-api-access-fb4ss\") pod \"redhat-marketplace-gzhg4\" (UID: \"4d9d6046-cd2e-407a-8264-66d3a10338a5\") " pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:04 crc kubenswrapper[4721]: I0202 13:58:04.235134 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:04 crc kubenswrapper[4721]: I0202 13:58:04.737716 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzhg4"] Feb 02 13:58:05 crc kubenswrapper[4721]: I0202 13:58:05.073432 4721 generic.go:334] "Generic (PLEG): container finished" podID="4d9d6046-cd2e-407a-8264-66d3a10338a5" containerID="40ca39e7055aabebd0ab710e77cf21f9170c4e29e95631c702dc47ca462e9e37" exitCode=0 Feb 02 13:58:05 crc kubenswrapper[4721]: I0202 13:58:05.073467 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzhg4" event={"ID":"4d9d6046-cd2e-407a-8264-66d3a10338a5","Type":"ContainerDied","Data":"40ca39e7055aabebd0ab710e77cf21f9170c4e29e95631c702dc47ca462e9e37"} Feb 02 13:58:05 crc kubenswrapper[4721]: I0202 13:58:05.073488 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzhg4" event={"ID":"4d9d6046-cd2e-407a-8264-66d3a10338a5","Type":"ContainerStarted","Data":"e939bbbba90ed00e2c720add47adf1da47f625f58a29dd833ce0584c2aeb5487"} Feb 02 13:58:07 crc kubenswrapper[4721]: I0202 13:58:07.094667 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzhg4" event={"ID":"4d9d6046-cd2e-407a-8264-66d3a10338a5","Type":"ContainerStarted","Data":"5b58ce4ab5d56c7b60cf36de1bef73a6bb590aab021624224e5cad64c6ef0fc4"} Feb 02 13:58:08 crc kubenswrapper[4721]: I0202 13:58:08.286989 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q6jkv" Feb 02 13:58:08 crc kubenswrapper[4721]: I0202 13:58:08.287345 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q6jkv" Feb 02 13:58:08 crc kubenswrapper[4721]: I0202 13:58:08.332169 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q6jkv" Feb 02 13:58:08 crc kubenswrapper[4721]: I0202 13:58:08.410026 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:58:08 crc kubenswrapper[4721]: E0202 13:58:08.410320 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 13:58:09 crc kubenswrapper[4721]: I0202 13:58:09.191700 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q6jkv" Feb 02 13:58:10 crc kubenswrapper[4721]: I0202 13:58:09.999743 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q6jkv"] Feb 02 13:58:10 crc kubenswrapper[4721]: I0202 13:58:10.141511 4721 generic.go:334] "Generic 
(PLEG): container finished" podID="4d9d6046-cd2e-407a-8264-66d3a10338a5" containerID="5b58ce4ab5d56c7b60cf36de1bef73a6bb590aab021624224e5cad64c6ef0fc4" exitCode=0 Feb 02 13:58:10 crc kubenswrapper[4721]: I0202 13:58:10.141548 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzhg4" event={"ID":"4d9d6046-cd2e-407a-8264-66d3a10338a5","Type":"ContainerDied","Data":"5b58ce4ab5d56c7b60cf36de1bef73a6bb590aab021624224e5cad64c6ef0fc4"} Feb 02 13:58:10 crc kubenswrapper[4721]: I0202 13:58:10.180506 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gc4db"] Feb 02 13:58:10 crc kubenswrapper[4721]: I0202 13:58:10.180767 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gc4db" podUID="3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" containerName="registry-server" containerID="cri-o://f83c45e1ff378ffe8b7f795fb252604530ddd6b5c22fbafb2acc28ce75d5835a" gracePeriod=2 Feb 02 13:58:10 crc kubenswrapper[4721]: E0202 13:58:10.438813 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d4d1a7c_52fd_456d_ab0e_78a9c4529fd1.slice/crio-f83c45e1ff378ffe8b7f795fb252604530ddd6b5c22fbafb2acc28ce75d5835a.scope\": RecentStats: unable to find data in memory cache]" Feb 02 13:58:10 crc kubenswrapper[4721]: I0202 13:58:10.753733 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:58:10 crc kubenswrapper[4721]: I0202 13:58:10.821912 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9grc\" (UniqueName: \"kubernetes.io/projected/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-kube-api-access-f9grc\") pod \"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1\" (UID: \"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1\") " Feb 02 13:58:10 crc kubenswrapper[4721]: I0202 13:58:10.822003 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-catalog-content\") pod \"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1\" (UID: \"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1\") " Feb 02 13:58:10 crc kubenswrapper[4721]: I0202 13:58:10.822162 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-utilities\") pod \"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1\" (UID: \"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1\") " Feb 02 13:58:10 crc kubenswrapper[4721]: I0202 13:58:10.822770 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-utilities" (OuterVolumeSpecName: "utilities") pod "3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" (UID: "3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:58:10 crc kubenswrapper[4721]: I0202 13:58:10.823022 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:58:10 crc kubenswrapper[4721]: I0202 13:58:10.835336 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-kube-api-access-f9grc" (OuterVolumeSpecName: "kube-api-access-f9grc") pod "3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" (UID: "3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1"). InnerVolumeSpecName "kube-api-access-f9grc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:58:10 crc kubenswrapper[4721]: I0202 13:58:10.924987 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9grc\" (UniqueName: \"kubernetes.io/projected/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-kube-api-access-f9grc\") on node \"crc\" DevicePath \"\"" Feb 02 13:58:10 crc kubenswrapper[4721]: I0202 13:58:10.953258 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" (UID: "3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.029257 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.159840 4721 generic.go:334] "Generic (PLEG): container finished" podID="3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" containerID="f83c45e1ff378ffe8b7f795fb252604530ddd6b5c22fbafb2acc28ce75d5835a" exitCode=0 Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.159924 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gc4db" event={"ID":"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1","Type":"ContainerDied","Data":"f83c45e1ff378ffe8b7f795fb252604530ddd6b5c22fbafb2acc28ce75d5835a"} Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.159957 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gc4db" event={"ID":"3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1","Type":"ContainerDied","Data":"c443084e3e1b254cd6cca1fcfdbe64c90be36a3dfa2b67b63b03d3820015e610"} Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.159977 4721 scope.go:117] "RemoveContainer" containerID="f83c45e1ff378ffe8b7f795fb252604530ddd6b5c22fbafb2acc28ce75d5835a" Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.160280 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gc4db" Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.168684 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzhg4" event={"ID":"4d9d6046-cd2e-407a-8264-66d3a10338a5","Type":"ContainerStarted","Data":"509d659a81e9649ed16c4124a959aa0f51c137c5e2085821387b15034b60cdea"} Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.201830 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gzhg4" podStartSLOduration=2.71937431 podStartE2EDuration="8.201809495s" podCreationTimestamp="2026-02-02 13:58:03 +0000 UTC" firstStartedPulling="2026-02-02 13:58:05.076497433 +0000 UTC m=+3425.379011822" lastFinishedPulling="2026-02-02 13:58:10.558932608 +0000 UTC m=+3430.861447007" observedRunningTime="2026-02-02 13:58:11.192752722 +0000 UTC m=+3431.495267111" watchObservedRunningTime="2026-02-02 13:58:11.201809495 +0000 UTC m=+3431.504323874" Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.206111 4721 scope.go:117] "RemoveContainer" containerID="cd748e7fb6f12d62b84d0d6332d25370d2d67d418388354bb68e32e6ced809d6" Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.231229 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gc4db"] Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.249900 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gc4db"] Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.259285 4721 scope.go:117] "RemoveContainer" containerID="b146ac5d525d99b3d9e62ac67dc0abf39c407bd8cadfc21eb11915cc07946f3f" Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.318445 4721 scope.go:117] "RemoveContainer" containerID="f83c45e1ff378ffe8b7f795fb252604530ddd6b5c22fbafb2acc28ce75d5835a" Feb 02 13:58:11 crc kubenswrapper[4721]: E0202 13:58:11.318874 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83c45e1ff378ffe8b7f795fb252604530ddd6b5c22fbafb2acc28ce75d5835a\": container with ID starting with f83c45e1ff378ffe8b7f795fb252604530ddd6b5c22fbafb2acc28ce75d5835a not found: ID does not exist" containerID="f83c45e1ff378ffe8b7f795fb252604530ddd6b5c22fbafb2acc28ce75d5835a" Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.318905 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83c45e1ff378ffe8b7f795fb252604530ddd6b5c22fbafb2acc28ce75d5835a"} err="failed to get container status \"f83c45e1ff378ffe8b7f795fb252604530ddd6b5c22fbafb2acc28ce75d5835a\": rpc error: code = NotFound desc = could not find container \"f83c45e1ff378ffe8b7f795fb252604530ddd6b5c22fbafb2acc28ce75d5835a\": container with ID starting with f83c45e1ff378ffe8b7f795fb252604530ddd6b5c22fbafb2acc28ce75d5835a not found: ID does not exist" Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.318934 4721 scope.go:117] "RemoveContainer" containerID="cd748e7fb6f12d62b84d0d6332d25370d2d67d418388354bb68e32e6ced809d6" Feb 02 13:58:11 crc kubenswrapper[4721]: E0202 13:58:11.319225 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd748e7fb6f12d62b84d0d6332d25370d2d67d418388354bb68e32e6ced809d6\": container with ID starting with cd748e7fb6f12d62b84d0d6332d25370d2d67d418388354bb68e32e6ced809d6 not found: ID does not exist" 
containerID="cd748e7fb6f12d62b84d0d6332d25370d2d67d418388354bb68e32e6ced809d6" Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.319256 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd748e7fb6f12d62b84d0d6332d25370d2d67d418388354bb68e32e6ced809d6"} err="failed to get container status \"cd748e7fb6f12d62b84d0d6332d25370d2d67d418388354bb68e32e6ced809d6\": rpc error: code = NotFound desc = could not find container \"cd748e7fb6f12d62b84d0d6332d25370d2d67d418388354bb68e32e6ced809d6\": container with ID starting with cd748e7fb6f12d62b84d0d6332d25370d2d67d418388354bb68e32e6ced809d6 not found: ID does not exist" Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.319285 4721 scope.go:117] "RemoveContainer" containerID="b146ac5d525d99b3d9e62ac67dc0abf39c407bd8cadfc21eb11915cc07946f3f" Feb 02 13:58:11 crc kubenswrapper[4721]: E0202 13:58:11.319651 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b146ac5d525d99b3d9e62ac67dc0abf39c407bd8cadfc21eb11915cc07946f3f\": container with ID starting with b146ac5d525d99b3d9e62ac67dc0abf39c407bd8cadfc21eb11915cc07946f3f not found: ID does not exist" containerID="b146ac5d525d99b3d9e62ac67dc0abf39c407bd8cadfc21eb11915cc07946f3f" Feb 02 13:58:11 crc kubenswrapper[4721]: I0202 13:58:11.319709 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b146ac5d525d99b3d9e62ac67dc0abf39c407bd8cadfc21eb11915cc07946f3f"} err="failed to get container status \"b146ac5d525d99b3d9e62ac67dc0abf39c407bd8cadfc21eb11915cc07946f3f\": rpc error: code = NotFound desc = could not find container \"b146ac5d525d99b3d9e62ac67dc0abf39c407bd8cadfc21eb11915cc07946f3f\": container with ID starting with b146ac5d525d99b3d9e62ac67dc0abf39c407bd8cadfc21eb11915cc07946f3f not found: ID does not exist" Feb 02 13:58:12 crc kubenswrapper[4721]: I0202 13:58:12.425219 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" path="/var/lib/kubelet/pods/3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1/volumes" Feb 02 13:58:14 crc kubenswrapper[4721]: I0202 13:58:14.235441 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:14 crc kubenswrapper[4721]: I0202 13:58:14.235817 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:14 crc kubenswrapper[4721]: I0202 13:58:14.291275 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:15 crc kubenswrapper[4721]: I0202 13:58:15.267605 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:16 crc kubenswrapper[4721]: I0202 13:58:16.370731 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzhg4"] Feb 02 13:58:17 crc kubenswrapper[4721]: I0202 13:58:17.231297 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gzhg4" podUID="4d9d6046-cd2e-407a-8264-66d3a10338a5" containerName="registry-server" containerID="cri-o://509d659a81e9649ed16c4124a959aa0f51c137c5e2085821387b15034b60cdea" gracePeriod=2 Feb 02 13:58:17 crc kubenswrapper[4721]: I0202 13:58:17.799436 4721 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:17 crc kubenswrapper[4721]: I0202 13:58:17.916542 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d9d6046-cd2e-407a-8264-66d3a10338a5-utilities\") pod \"4d9d6046-cd2e-407a-8264-66d3a10338a5\" (UID: \"4d9d6046-cd2e-407a-8264-66d3a10338a5\") " Feb 02 13:58:17 crc kubenswrapper[4721]: I0202 13:58:17.916709 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb4ss\" (UniqueName: \"kubernetes.io/projected/4d9d6046-cd2e-407a-8264-66d3a10338a5-kube-api-access-fb4ss\") pod \"4d9d6046-cd2e-407a-8264-66d3a10338a5\" (UID: \"4d9d6046-cd2e-407a-8264-66d3a10338a5\") " Feb 02 13:58:17 crc kubenswrapper[4721]: I0202 13:58:17.916738 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d9d6046-cd2e-407a-8264-66d3a10338a5-catalog-content\") pod \"4d9d6046-cd2e-407a-8264-66d3a10338a5\" (UID: \"4d9d6046-cd2e-407a-8264-66d3a10338a5\") " Feb 02 13:58:17 crc kubenswrapper[4721]: I0202 13:58:17.917364 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d9d6046-cd2e-407a-8264-66d3a10338a5-utilities" (OuterVolumeSpecName: "utilities") pod "4d9d6046-cd2e-407a-8264-66d3a10338a5" (UID: "4d9d6046-cd2e-407a-8264-66d3a10338a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:58:17 crc kubenswrapper[4721]: I0202 13:58:17.918085 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d9d6046-cd2e-407a-8264-66d3a10338a5-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:58:17 crc kubenswrapper[4721]: I0202 13:58:17.924093 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d9d6046-cd2e-407a-8264-66d3a10338a5-kube-api-access-fb4ss" (OuterVolumeSpecName: "kube-api-access-fb4ss") pod "4d9d6046-cd2e-407a-8264-66d3a10338a5" (UID: "4d9d6046-cd2e-407a-8264-66d3a10338a5"). InnerVolumeSpecName "kube-api-access-fb4ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:58:17 crc kubenswrapper[4721]: I0202 13:58:17.942614 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d9d6046-cd2e-407a-8264-66d3a10338a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d9d6046-cd2e-407a-8264-66d3a10338a5" (UID: "4d9d6046-cd2e-407a-8264-66d3a10338a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.020102 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb4ss\" (UniqueName: \"kubernetes.io/projected/4d9d6046-cd2e-407a-8264-66d3a10338a5-kube-api-access-fb4ss\") on node \"crc\" DevicePath \"\"" Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.020434 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d9d6046-cd2e-407a-8264-66d3a10338a5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.245025 4721 generic.go:334] "Generic (PLEG): container finished" podID="4d9d6046-cd2e-407a-8264-66d3a10338a5" containerID="509d659a81e9649ed16c4124a959aa0f51c137c5e2085821387b15034b60cdea" exitCode=0 Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.245161 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzhg4" event={"ID":"4d9d6046-cd2e-407a-8264-66d3a10338a5","Type":"ContainerDied","Data":"509d659a81e9649ed16c4124a959aa0f51c137c5e2085821387b15034b60cdea"} Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.245174 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzhg4" Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.245192 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzhg4" event={"ID":"4d9d6046-cd2e-407a-8264-66d3a10338a5","Type":"ContainerDied","Data":"e939bbbba90ed00e2c720add47adf1da47f625f58a29dd833ce0584c2aeb5487"} Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.245214 4721 scope.go:117] "RemoveContainer" containerID="509d659a81e9649ed16c4124a959aa0f51c137c5e2085821387b15034b60cdea" Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.265643 4721 scope.go:117] "RemoveContainer" containerID="5b58ce4ab5d56c7b60cf36de1bef73a6bb590aab021624224e5cad64c6ef0fc4" Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.283320 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzhg4"] Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.294408 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzhg4"] Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.295881 4721 scope.go:117] "RemoveContainer" containerID="40ca39e7055aabebd0ab710e77cf21f9170c4e29e95631c702dc47ca462e9e37" Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.358639 4721 scope.go:117] "RemoveContainer" containerID="509d659a81e9649ed16c4124a959aa0f51c137c5e2085821387b15034b60cdea" Feb 02 13:58:18 crc kubenswrapper[4721]: E0202 13:58:18.359034 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"509d659a81e9649ed16c4124a959aa0f51c137c5e2085821387b15034b60cdea\": container with ID starting with 509d659a81e9649ed16c4124a959aa0f51c137c5e2085821387b15034b60cdea not found: ID does not exist" containerID="509d659a81e9649ed16c4124a959aa0f51c137c5e2085821387b15034b60cdea" Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.359088 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"509d659a81e9649ed16c4124a959aa0f51c137c5e2085821387b15034b60cdea"} err="failed to get container status 
\"509d659a81e9649ed16c4124a959aa0f51c137c5e2085821387b15034b60cdea\": rpc error: code = NotFound desc = could not find container \"509d659a81e9649ed16c4124a959aa0f51c137c5e2085821387b15034b60cdea\": container with ID starting with 509d659a81e9649ed16c4124a959aa0f51c137c5e2085821387b15034b60cdea not found: ID does not exist" Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.359115 4721 scope.go:117] "RemoveContainer" containerID="5b58ce4ab5d56c7b60cf36de1bef73a6bb590aab021624224e5cad64c6ef0fc4" Feb 02 13:58:18 crc kubenswrapper[4721]: E0202 13:58:18.359444 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b58ce4ab5d56c7b60cf36de1bef73a6bb590aab021624224e5cad64c6ef0fc4\": container with ID starting with 5b58ce4ab5d56c7b60cf36de1bef73a6bb590aab021624224e5cad64c6ef0fc4 not found: ID does not exist" containerID="5b58ce4ab5d56c7b60cf36de1bef73a6bb590aab021624224e5cad64c6ef0fc4" Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.359473 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b58ce4ab5d56c7b60cf36de1bef73a6bb590aab021624224e5cad64c6ef0fc4"} err="failed to get container status \"5b58ce4ab5d56c7b60cf36de1bef73a6bb590aab021624224e5cad64c6ef0fc4\": rpc error: code = NotFound desc = could not find container \"5b58ce4ab5d56c7b60cf36de1bef73a6bb590aab021624224e5cad64c6ef0fc4\": container with ID starting with 5b58ce4ab5d56c7b60cf36de1bef73a6bb590aab021624224e5cad64c6ef0fc4 not found: ID does not exist" Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.359493 4721 scope.go:117] "RemoveContainer" containerID="40ca39e7055aabebd0ab710e77cf21f9170c4e29e95631c702dc47ca462e9e37" Feb 02 13:58:18 crc kubenswrapper[4721]: E0202 13:58:18.359881 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40ca39e7055aabebd0ab710e77cf21f9170c4e29e95631c702dc47ca462e9e37\": container with ID starting with 40ca39e7055aabebd0ab710e77cf21f9170c4e29e95631c702dc47ca462e9e37 not found: ID does not exist" containerID="40ca39e7055aabebd0ab710e77cf21f9170c4e29e95631c702dc47ca462e9e37" Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.359904 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ca39e7055aabebd0ab710e77cf21f9170c4e29e95631c702dc47ca462e9e37"} err="failed to get container status \"40ca39e7055aabebd0ab710e77cf21f9170c4e29e95631c702dc47ca462e9e37\": rpc error: code = NotFound desc = could not find container \"40ca39e7055aabebd0ab710e77cf21f9170c4e29e95631c702dc47ca462e9e37\": container with ID starting with 40ca39e7055aabebd0ab710e77cf21f9170c4e29e95631c702dc47ca462e9e37 not found: ID does not exist" Feb 02 13:58:18 crc kubenswrapper[4721]: I0202 13:58:18.423033 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d9d6046-cd2e-407a-8264-66d3a10338a5" path="/var/lib/kubelet/pods/4d9d6046-cd2e-407a-8264-66d3a10338a5/volumes" Feb 02 13:58:21 crc kubenswrapper[4721]: I0202 13:58:21.410429 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 13:58:22 crc kubenswrapper[4721]: I0202 13:58:22.283792 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" 
event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"a3cd5fb955e420aefedf3293a111222e0af1cbe46d543be8593a16454e1d2d8d"} Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.227829 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv"] Feb 02 14:00:00 crc kubenswrapper[4721]: E0202 14:00:00.229542 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" containerName="extract-utilities" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.229566 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" containerName="extract-utilities" Feb 02 14:00:00 crc kubenswrapper[4721]: E0202 14:00:00.229619 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" containerName="registry-server" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.229629 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" containerName="registry-server" Feb 02 14:00:00 crc kubenswrapper[4721]: E0202 14:00:00.229653 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" containerName="extract-content" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.229662 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" containerName="extract-content" Feb 02 14:00:00 crc kubenswrapper[4721]: E0202 14:00:00.229678 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d9d6046-cd2e-407a-8264-66d3a10338a5" containerName="registry-server" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.229684 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9d6046-cd2e-407a-8264-66d3a10338a5" containerName="registry-server" Feb 02 14:00:00 crc kubenswrapper[4721]: E0202 14:00:00.229701 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d9d6046-cd2e-407a-8264-66d3a10338a5" containerName="extract-content" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.229707 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9d6046-cd2e-407a-8264-66d3a10338a5" containerName="extract-content" Feb 02 14:00:00 crc kubenswrapper[4721]: E0202 14:00:00.229739 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d9d6046-cd2e-407a-8264-66d3a10338a5" containerName="extract-utilities" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.229747 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9d6046-cd2e-407a-8264-66d3a10338a5" containerName="extract-utilities" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.231151 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d4d1a7c-52fd-456d-ab0e-78a9c4529fd1" containerName="registry-server" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.231211 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d9d6046-cd2e-407a-8264-66d3a10338a5" containerName="registry-server" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.232516 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.235901 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.236119 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.244160 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv"] Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.409510 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-config-volume\") pod \"collect-profiles-29500680-8fnmv\" (UID: \"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.409614 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbcb4\" (UniqueName: \"kubernetes.io/projected/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-kube-api-access-rbcb4\") pod \"collect-profiles-29500680-8fnmv\" (UID: \"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.409659 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-secret-volume\") pod \"collect-profiles-29500680-8fnmv\" (UID: \"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.512293 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-config-volume\") pod \"collect-profiles-29500680-8fnmv\" (UID: \"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.512415 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbcb4\" (UniqueName: \"kubernetes.io/projected/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-kube-api-access-rbcb4\") pod \"collect-profiles-29500680-8fnmv\" (UID: \"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.512459 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-secret-volume\") pod \"collect-profiles-29500680-8fnmv\" (UID: \"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.515537 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 
14:00:00.519655 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-secret-volume\") pod \"collect-profiles-29500680-8fnmv\" (UID: \"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.527756 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-config-volume\") pod \"collect-profiles-29500680-8fnmv\" (UID: \"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.533737 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbcb4\" (UniqueName: \"kubernetes.io/projected/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-kube-api-access-rbcb4\") pod \"collect-profiles-29500680-8fnmv\" (UID: \"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.564261 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 14:00:00 crc kubenswrapper[4721]: I0202 14:00:00.573351 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" Feb 02 14:00:01 crc kubenswrapper[4721]: I0202 14:00:01.067217 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv"] Feb 02 14:00:01 crc kubenswrapper[4721]: I0202 14:00:01.296197 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" event={"ID":"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f","Type":"ContainerStarted","Data":"29f49894455bd7961b6e45a7b07191bfd0b099e4c502956048b722e54f7a2403"} Feb 02 14:00:02 crc kubenswrapper[4721]: I0202 14:00:02.305780 4721 generic.go:334] "Generic (PLEG): container finished" podID="a3a359c1-b4b9-4b88-8bc0-da7c97b7225f" containerID="626dacec11ff7af02b246606cab5fd3871139e6bcc650e26f41f5bc94d95a99b" exitCode=0 Feb 02 14:00:02 crc kubenswrapper[4721]: I0202 14:00:02.305839 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" event={"ID":"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f","Type":"ContainerDied","Data":"626dacec11ff7af02b246606cab5fd3871139e6bcc650e26f41f5bc94d95a99b"} Feb 02 14:00:03 crc kubenswrapper[4721]: I0202 14:00:03.744804 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" Feb 02 14:00:03 crc kubenswrapper[4721]: I0202 14:00:03.897815 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-secret-volume\") pod \"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f\" (UID: \"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f\") " Feb 02 14:00:03 crc kubenswrapper[4721]: I0202 14:00:03.898003 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbcb4\" (UniqueName: \"kubernetes.io/projected/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-kube-api-access-rbcb4\") pod \"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f\" (UID: \"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f\") " Feb 02 14:00:03 crc kubenswrapper[4721]: I0202 14:00:03.898119 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-config-volume\") pod \"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f\" (UID: \"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f\") " Feb 02 14:00:03 crc kubenswrapper[4721]: I0202 14:00:03.898796 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-config-volume" (OuterVolumeSpecName: "config-volume") pod "a3a359c1-b4b9-4b88-8bc0-da7c97b7225f" (UID: "a3a359c1-b4b9-4b88-8bc0-da7c97b7225f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 14:00:03 crc kubenswrapper[4721]: I0202 14:00:03.904198 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a3a359c1-b4b9-4b88-8bc0-da7c97b7225f" (UID: "a3a359c1-b4b9-4b88-8bc0-da7c97b7225f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 14:00:03 crc kubenswrapper[4721]: I0202 14:00:03.904777 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-kube-api-access-rbcb4" (OuterVolumeSpecName: "kube-api-access-rbcb4") pod "a3a359c1-b4b9-4b88-8bc0-da7c97b7225f" (UID: "a3a359c1-b4b9-4b88-8bc0-da7c97b7225f"). InnerVolumeSpecName "kube-api-access-rbcb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 14:00:04 crc kubenswrapper[4721]: I0202 14:00:04.001856 4721 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 14:00:04 crc kubenswrapper[4721]: I0202 14:00:04.001901 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbcb4\" (UniqueName: \"kubernetes.io/projected/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-kube-api-access-rbcb4\") on node \"crc\" DevicePath \"\"" Feb 02 14:00:04 crc kubenswrapper[4721]: I0202 14:00:04.001915 4721 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3a359c1-b4b9-4b88-8bc0-da7c97b7225f-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 14:00:04 crc kubenswrapper[4721]: I0202 14:00:04.328498 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" event={"ID":"a3a359c1-b4b9-4b88-8bc0-da7c97b7225f","Type":"ContainerDied","Data":"29f49894455bd7961b6e45a7b07191bfd0b099e4c502956048b722e54f7a2403"} Feb 02 14:00:04 crc kubenswrapper[4721]: I0202 14:00:04.328536 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29f49894455bd7961b6e45a7b07191bfd0b099e4c502956048b722e54f7a2403" Feb 02 14:00:04 crc kubenswrapper[4721]: I0202 14:00:04.328598 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500680-8fnmv" Feb 02 14:00:04 crc kubenswrapper[4721]: I0202 14:00:04.870311 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn"] Feb 02 14:00:04 crc kubenswrapper[4721]: I0202 14:00:04.885157 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500635-qjbdn"] Feb 02 14:00:06 crc kubenswrapper[4721]: I0202 14:00:06.435832 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c984cb-b059-4e3f-86f2-8abca8e6942e" path="/var/lib/kubelet/pods/13c984cb-b059-4e3f-86f2-8abca8e6942e/volumes" Feb 02 14:00:14 crc kubenswrapper[4721]: I0202 14:00:14.516452 4721 scope.go:117] "RemoveContainer" containerID="dbd339a45a88197a4052721f67969cee0f84e4e520076f5a19a1a3e14ab9298f" Feb 02 14:00:44 crc kubenswrapper[4721]: I0202 14:00:44.763466 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:00:44 crc kubenswrapper[4721]: I0202 14:00:44.764080 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.444719 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29500681-7xs4g"] Feb 02 14:01:00 crc kubenswrapper[4721]: E0202 14:01:00.446703 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3a359c1-b4b9-4b88-8bc0-da7c97b7225f" 
containerName="collect-profiles" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.446787 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a359c1-b4b9-4b88-8bc0-da7c97b7225f" containerName="collect-profiles" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.447092 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3a359c1-b4b9-4b88-8bc0-da7c97b7225f" containerName="collect-profiles" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.447910 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29500681-7xs4g"] Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.448060 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.568087 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vxcl\" (UniqueName: \"kubernetes.io/projected/9b53b618-4727-4a17-a000-ba0ccd1084c1-kube-api-access-8vxcl\") pod \"keystone-cron-29500681-7xs4g\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.568244 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-fernet-keys\") pod \"keystone-cron-29500681-7xs4g\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.568377 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-config-data\") pod \"keystone-cron-29500681-7xs4g\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.569058 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-combined-ca-bundle\") pod \"keystone-cron-29500681-7xs4g\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.671018 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vxcl\" (UniqueName: \"kubernetes.io/projected/9b53b618-4727-4a17-a000-ba0ccd1084c1-kube-api-access-8vxcl\") pod \"keystone-cron-29500681-7xs4g\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.671138 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-fernet-keys\") pod \"keystone-cron-29500681-7xs4g\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.671178 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-config-data\") pod \"keystone-cron-29500681-7xs4g\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " 
pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.671291 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-combined-ca-bundle\") pod \"keystone-cron-29500681-7xs4g\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.679230 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-combined-ca-bundle\") pod \"keystone-cron-29500681-7xs4g\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.679431 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-fernet-keys\") pod \"keystone-cron-29500681-7xs4g\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.679772 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-config-data\") pod \"keystone-cron-29500681-7xs4g\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.691140 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vxcl\" (UniqueName: \"kubernetes.io/projected/9b53b618-4727-4a17-a000-ba0ccd1084c1-kube-api-access-8vxcl\") pod \"keystone-cron-29500681-7xs4g\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:00 crc kubenswrapper[4721]: I0202 14:01:00.773041 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:01 crc kubenswrapper[4721]: I0202 14:01:01.261797 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29500681-7xs4g"] Feb 02 14:01:01 crc kubenswrapper[4721]: I0202 14:01:01.388316 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500681-7xs4g" event={"ID":"9b53b618-4727-4a17-a000-ba0ccd1084c1","Type":"ContainerStarted","Data":"06a522d4079d336fac9999ebe269964f7faeeaac86a9831fc3f7d27a0e2e96ef"} Feb 02 14:01:02 crc kubenswrapper[4721]: I0202 14:01:02.402993 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500681-7xs4g" event={"ID":"9b53b618-4727-4a17-a000-ba0ccd1084c1","Type":"ContainerStarted","Data":"9ab18faa360117f946acf77d5bbb30958f5c311100041690cf4612b97b1423d6"} Feb 02 14:01:02 crc kubenswrapper[4721]: I0202 14:01:02.430297 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29500681-7xs4g" podStartSLOduration=2.430276607 podStartE2EDuration="2.430276607s" podCreationTimestamp="2026-02-02 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 14:01:02.424000997 +0000 UTC m=+3602.726515386" watchObservedRunningTime="2026-02-02 14:01:02.430276607 +0000 UTC m=+3602.732790996" Feb 02 14:01:07 crc kubenswrapper[4721]: I0202 14:01:07.456701 4721 generic.go:334] "Generic (PLEG): container finished" podID="9b53b618-4727-4a17-a000-ba0ccd1084c1" containerID="9ab18faa360117f946acf77d5bbb30958f5c311100041690cf4612b97b1423d6" exitCode=0 Feb 02 14:01:07 crc kubenswrapper[4721]: I0202 14:01:07.457284 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500681-7xs4g" event={"ID":"9b53b618-4727-4a17-a000-ba0ccd1084c1","Type":"ContainerDied","Data":"9ab18faa360117f946acf77d5bbb30958f5c311100041690cf4612b97b1423d6"} Feb 02 14:01:08 crc kubenswrapper[4721]: I0202 14:01:08.828264 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:08 crc kubenswrapper[4721]: I0202 14:01:08.916132 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-fernet-keys\") pod \"9b53b618-4727-4a17-a000-ba0ccd1084c1\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " Feb 02 14:01:08 crc kubenswrapper[4721]: I0202 14:01:08.916368 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vxcl\" (UniqueName: \"kubernetes.io/projected/9b53b618-4727-4a17-a000-ba0ccd1084c1-kube-api-access-8vxcl\") pod \"9b53b618-4727-4a17-a000-ba0ccd1084c1\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " Feb 02 14:01:08 crc kubenswrapper[4721]: I0202 14:01:08.916481 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-combined-ca-bundle\") pod \"9b53b618-4727-4a17-a000-ba0ccd1084c1\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " Feb 02 14:01:08 crc kubenswrapper[4721]: I0202 14:01:08.916545 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-config-data\") pod \"9b53b618-4727-4a17-a000-ba0ccd1084c1\" (UID: \"9b53b618-4727-4a17-a000-ba0ccd1084c1\") " Feb 02 14:01:08 crc kubenswrapper[4721]: I0202 14:01:08.923370 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9b53b618-4727-4a17-a000-ba0ccd1084c1" (UID: "9b53b618-4727-4a17-a000-ba0ccd1084c1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 14:01:08 crc kubenswrapper[4721]: I0202 14:01:08.923678 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b53b618-4727-4a17-a000-ba0ccd1084c1-kube-api-access-8vxcl" (OuterVolumeSpecName: "kube-api-access-8vxcl") pod "9b53b618-4727-4a17-a000-ba0ccd1084c1" (UID: "9b53b618-4727-4a17-a000-ba0ccd1084c1"). InnerVolumeSpecName "kube-api-access-8vxcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 14:01:08 crc kubenswrapper[4721]: I0202 14:01:08.954298 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b53b618-4727-4a17-a000-ba0ccd1084c1" (UID: "9b53b618-4727-4a17-a000-ba0ccd1084c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 14:01:08 crc kubenswrapper[4721]: I0202 14:01:08.987059 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-config-data" (OuterVolumeSpecName: "config-data") pod "9b53b618-4727-4a17-a000-ba0ccd1084c1" (UID: "9b53b618-4727-4a17-a000-ba0ccd1084c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 14:01:09 crc kubenswrapper[4721]: I0202 14:01:09.019976 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vxcl\" (UniqueName: \"kubernetes.io/projected/9b53b618-4727-4a17-a000-ba0ccd1084c1-kube-api-access-8vxcl\") on node \"crc\" DevicePath \"\"" Feb 02 14:01:09 crc kubenswrapper[4721]: I0202 14:01:09.020015 4721 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 14:01:09 crc kubenswrapper[4721]: I0202 14:01:09.020027 4721 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 14:01:09 crc kubenswrapper[4721]: I0202 14:01:09.020039 4721 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b53b618-4727-4a17-a000-ba0ccd1084c1-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 02 14:01:09 crc kubenswrapper[4721]: I0202 14:01:09.478859 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500681-7xs4g" event={"ID":"9b53b618-4727-4a17-a000-ba0ccd1084c1","Type":"ContainerDied","Data":"06a522d4079d336fac9999ebe269964f7faeeaac86a9831fc3f7d27a0e2e96ef"} Feb 02 14:01:09 crc kubenswrapper[4721]: I0202 14:01:09.478900 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06a522d4079d336fac9999ebe269964f7faeeaac86a9831fc3f7d27a0e2e96ef" Feb 02 14:01:09 crc kubenswrapper[4721]: I0202 14:01:09.478933 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29500681-7xs4g" Feb 02 14:01:14 crc kubenswrapper[4721]: I0202 14:01:14.763556 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:01:14 crc kubenswrapper[4721]: I0202 14:01:14.765336 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:01:44 crc kubenswrapper[4721]: I0202 14:01:44.763142 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:01:44 crc kubenswrapper[4721]: I0202 14:01:44.763663 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:01:44 crc kubenswrapper[4721]: I0202 14:01:44.763707 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 
14:01:44 crc kubenswrapper[4721]: I0202 14:01:44.764804 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3cd5fb955e420aefedf3293a111222e0af1cbe46d543be8593a16454e1d2d8d"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 14:01:44 crc kubenswrapper[4721]: I0202 14:01:44.764874 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://a3cd5fb955e420aefedf3293a111222e0af1cbe46d543be8593a16454e1d2d8d" gracePeriod=600 Feb 02 14:01:45 crc kubenswrapper[4721]: I0202 14:01:45.908187 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="a3cd5fb955e420aefedf3293a111222e0af1cbe46d543be8593a16454e1d2d8d" exitCode=0 Feb 02 14:01:45 crc kubenswrapper[4721]: I0202 14:01:45.908275 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"a3cd5fb955e420aefedf3293a111222e0af1cbe46d543be8593a16454e1d2d8d"} Feb 02 14:01:45 crc kubenswrapper[4721]: I0202 14:01:45.908835 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3"} Feb 02 14:01:45 crc kubenswrapper[4721]: I0202 14:01:45.908873 4721 scope.go:117] "RemoveContainer" containerID="b7871a57b1ca8a3b9cb67a09becc2a2b69da768974a437742c318057afd45c95" Feb 02 14:04:14 crc kubenswrapper[4721]: I0202 14:04:14.764246 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:04:14 crc kubenswrapper[4721]: I0202 14:04:14.764782 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:04:44 crc kubenswrapper[4721]: I0202 14:04:44.763238 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:04:44 crc kubenswrapper[4721]: I0202 14:04:44.763859 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:05:14 crc kubenswrapper[4721]: I0202 14:05:14.763211 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:05:14 crc kubenswrapper[4721]: I0202 14:05:14.763895 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:05:14 crc kubenswrapper[4721]: I0202 14:05:14.763944 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 14:05:14 crc kubenswrapper[4721]: I0202 14:05:14.764953 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 14:05:14 crc kubenswrapper[4721]: I0202 14:05:14.765014 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" gracePeriod=600 Feb 02 14:05:14 crc kubenswrapper[4721]: E0202 14:05:14.913970 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:05:15 crc kubenswrapper[4721]: I0202 14:05:15.116511 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" exitCode=0 Feb 02 14:05:15 crc kubenswrapper[4721]: I0202 14:05:15.116585 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3"} Feb 02 14:05:15 crc kubenswrapper[4721]: I0202 14:05:15.116864 4721 scope.go:117] "RemoveContainer" containerID="a3cd5fb955e420aefedf3293a111222e0af1cbe46d543be8593a16454e1d2d8d" Feb 02 14:05:15 crc kubenswrapper[4721]: I0202 14:05:15.117605 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:05:15 crc kubenswrapper[4721]: E0202 14:05:15.117879 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" 
podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:05:30 crc kubenswrapper[4721]: I0202 14:05:30.416357 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:05:30 crc kubenswrapper[4721]: E0202 14:05:30.417224 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:05:44 crc kubenswrapper[4721]: I0202 14:05:44.409741 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:05:44 crc kubenswrapper[4721]: E0202 14:05:44.410544 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:05:55 crc kubenswrapper[4721]: E0202 14:05:55.319054 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Feb 02 14:05:57 crc kubenswrapper[4721]: I0202 14:05:57.411735 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:05:57 crc kubenswrapper[4721]: E0202 14:05:57.412407 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:06:10 crc kubenswrapper[4721]: I0202 14:06:10.410308 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:06:10 crc kubenswrapper[4721]: E0202 14:06:10.410923 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:06:25 crc kubenswrapper[4721]: I0202 14:06:25.409905 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:06:25 crc kubenswrapper[4721]: E0202 14:06:25.410818 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:06:38 crc kubenswrapper[4721]: I0202 14:06:38.409913 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:06:38 crc kubenswrapper[4721]: E0202 14:06:38.410940 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:06:50 crc kubenswrapper[4721]: I0202 14:06:50.419354 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:06:50 crc kubenswrapper[4721]: E0202 14:06:50.420239 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:07:04 crc kubenswrapper[4721]: I0202 14:07:04.409771 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:07:04 crc kubenswrapper[4721]: E0202 14:07:04.410767 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:07:14 crc kubenswrapper[4721]: I0202 14:07:14.981386 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s6b8c"] Feb 02 14:07:14 crc kubenswrapper[4721]: E0202 14:07:14.985257 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b53b618-4727-4a17-a000-ba0ccd1084c1" containerName="keystone-cron" Feb 02 14:07:14 crc kubenswrapper[4721]: I0202 14:07:14.985295 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b53b618-4727-4a17-a000-ba0ccd1084c1" containerName="keystone-cron" Feb 02 14:07:14 crc kubenswrapper[4721]: I0202 14:07:14.985607 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b53b618-4727-4a17-a000-ba0ccd1084c1" containerName="keystone-cron" Feb 02 14:07:14 crc kubenswrapper[4721]: I0202 14:07:14.987781 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:15 crc kubenswrapper[4721]: I0202 14:07:15.043408 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s6b8c"] Feb 02 14:07:15 crc kubenswrapper[4721]: I0202 14:07:15.056893 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9xpt\" (UniqueName: \"kubernetes.io/projected/132d342c-14e7-4cf5-a57b-ec168398bcd6-kube-api-access-g9xpt\") pod \"community-operators-s6b8c\" (UID: \"132d342c-14e7-4cf5-a57b-ec168398bcd6\") " pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:15 crc kubenswrapper[4721]: I0202 14:07:15.057094 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132d342c-14e7-4cf5-a57b-ec168398bcd6-catalog-content\") pod \"community-operators-s6b8c\" (UID: \"132d342c-14e7-4cf5-a57b-ec168398bcd6\") " pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:15 crc kubenswrapper[4721]: I0202 14:07:15.057129 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132d342c-14e7-4cf5-a57b-ec168398bcd6-utilities\") pod \"community-operators-s6b8c\" (UID: \"132d342c-14e7-4cf5-a57b-ec168398bcd6\") " pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:15 crc kubenswrapper[4721]: I0202 14:07:15.159350 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132d342c-14e7-4cf5-a57b-ec168398bcd6-catalog-content\") pod \"community-operators-s6b8c\" (UID: \"132d342c-14e7-4cf5-a57b-ec168398bcd6\") " pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:15 crc kubenswrapper[4721]: I0202 14:07:15.159399 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132d342c-14e7-4cf5-a57b-ec168398bcd6-utilities\") pod \"community-operators-s6b8c\" (UID: \"132d342c-14e7-4cf5-a57b-ec168398bcd6\") " pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:15 crc kubenswrapper[4721]: I0202 14:07:15.159568 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9xpt\" (UniqueName: \"kubernetes.io/projected/132d342c-14e7-4cf5-a57b-ec168398bcd6-kube-api-access-g9xpt\") pod \"community-operators-s6b8c\" (UID: \"132d342c-14e7-4cf5-a57b-ec168398bcd6\") " pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:15 crc kubenswrapper[4721]: I0202 14:07:15.160282 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132d342c-14e7-4cf5-a57b-ec168398bcd6-catalog-content\") pod \"community-operators-s6b8c\" (UID: \"132d342c-14e7-4cf5-a57b-ec168398bcd6\") " pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:15 crc kubenswrapper[4721]: I0202 14:07:15.160369 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132d342c-14e7-4cf5-a57b-ec168398bcd6-utilities\") pod \"community-operators-s6b8c\" (UID: \"132d342c-14e7-4cf5-a57b-ec168398bcd6\") " pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:15 crc kubenswrapper[4721]: I0202 14:07:15.179030 4721 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g9xpt\" (UniqueName: \"kubernetes.io/projected/132d342c-14e7-4cf5-a57b-ec168398bcd6-kube-api-access-g9xpt\") pod \"community-operators-s6b8c\" (UID: \"132d342c-14e7-4cf5-a57b-ec168398bcd6\") " pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:15 crc kubenswrapper[4721]: I0202 14:07:15.375444 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:16 crc kubenswrapper[4721]: I0202 14:07:16.080733 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s6b8c"] Feb 02 14:07:16 crc kubenswrapper[4721]: I0202 14:07:16.368994 4721 generic.go:334] "Generic (PLEG): container finished" podID="132d342c-14e7-4cf5-a57b-ec168398bcd6" containerID="a1e1933fddf132b29cc798adc427aff4954d9bb865810605d5b7733c86a3c42c" exitCode=0 Feb 02 14:07:16 crc kubenswrapper[4721]: I0202 14:07:16.369164 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6b8c" event={"ID":"132d342c-14e7-4cf5-a57b-ec168398bcd6","Type":"ContainerDied","Data":"a1e1933fddf132b29cc798adc427aff4954d9bb865810605d5b7733c86a3c42c"} Feb 02 14:07:16 crc kubenswrapper[4721]: I0202 14:07:16.369280 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6b8c" event={"ID":"132d342c-14e7-4cf5-a57b-ec168398bcd6","Type":"ContainerStarted","Data":"9e948eff55381bb46ecc98162f082b8ff09ac0413fd3e4474bf9ea804dd01688"} Feb 02 14:07:16 crc kubenswrapper[4721]: I0202 14:07:16.372510 4721 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 14:07:18 crc kubenswrapper[4721]: I0202 14:07:18.395205 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6b8c" event={"ID":"132d342c-14e7-4cf5-a57b-ec168398bcd6","Type":"ContainerStarted","Data":"be00503c8dac0091b8b485e8ce6e48268fac83904056717e35c733050f51c4c3"} Feb 02 14:07:19 crc kubenswrapper[4721]: I0202 14:07:19.410622 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:07:19 crc kubenswrapper[4721]: E0202 14:07:19.411355 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:07:19 crc kubenswrapper[4721]: I0202 14:07:19.412704 4721 generic.go:334] "Generic (PLEG): container finished" podID="132d342c-14e7-4cf5-a57b-ec168398bcd6" containerID="be00503c8dac0091b8b485e8ce6e48268fac83904056717e35c733050f51c4c3" exitCode=0 Feb 02 14:07:19 crc kubenswrapper[4721]: I0202 14:07:19.412742 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6b8c" event={"ID":"132d342c-14e7-4cf5-a57b-ec168398bcd6","Type":"ContainerDied","Data":"be00503c8dac0091b8b485e8ce6e48268fac83904056717e35c733050f51c4c3"} Feb 02 14:07:20 crc kubenswrapper[4721]: I0202 14:07:20.428060 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6b8c" 
event={"ID":"132d342c-14e7-4cf5-a57b-ec168398bcd6","Type":"ContainerStarted","Data":"13c7a2c5f7de51349240cfcd6a1e2310a3a001542a18161adbb2b46b9d27bedf"} Feb 02 14:07:20 crc kubenswrapper[4721]: I0202 14:07:20.459141 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s6b8c" podStartSLOduration=2.984388626 podStartE2EDuration="6.459116205s" podCreationTimestamp="2026-02-02 14:07:14 +0000 UTC" firstStartedPulling="2026-02-02 14:07:16.372233797 +0000 UTC m=+3976.674748176" lastFinishedPulling="2026-02-02 14:07:19.846961366 +0000 UTC m=+3980.149475755" observedRunningTime="2026-02-02 14:07:20.446263197 +0000 UTC m=+3980.748777586" watchObservedRunningTime="2026-02-02 14:07:20.459116205 +0000 UTC m=+3980.761630594" Feb 02 14:07:25 crc kubenswrapper[4721]: I0202 14:07:25.375962 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:25 crc kubenswrapper[4721]: I0202 14:07:25.376407 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:25 crc kubenswrapper[4721]: I0202 14:07:25.425789 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:25 crc kubenswrapper[4721]: I0202 14:07:25.532758 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:25 crc kubenswrapper[4721]: I0202 14:07:25.686836 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s6b8c"] Feb 02 14:07:27 crc kubenswrapper[4721]: I0202 14:07:27.502463 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s6b8c" podUID="132d342c-14e7-4cf5-a57b-ec168398bcd6" containerName="registry-server" containerID="cri-o://13c7a2c5f7de51349240cfcd6a1e2310a3a001542a18161adbb2b46b9d27bedf" gracePeriod=2 Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.120580 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.266410 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9xpt\" (UniqueName: \"kubernetes.io/projected/132d342c-14e7-4cf5-a57b-ec168398bcd6-kube-api-access-g9xpt\") pod \"132d342c-14e7-4cf5-a57b-ec168398bcd6\" (UID: \"132d342c-14e7-4cf5-a57b-ec168398bcd6\") " Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.266631 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132d342c-14e7-4cf5-a57b-ec168398bcd6-catalog-content\") pod \"132d342c-14e7-4cf5-a57b-ec168398bcd6\" (UID: \"132d342c-14e7-4cf5-a57b-ec168398bcd6\") " Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.266759 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132d342c-14e7-4cf5-a57b-ec168398bcd6-utilities\") pod \"132d342c-14e7-4cf5-a57b-ec168398bcd6\" (UID: \"132d342c-14e7-4cf5-a57b-ec168398bcd6\") " Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.268289 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/132d342c-14e7-4cf5-a57b-ec168398bcd6-utilities" (OuterVolumeSpecName: "utilities") pod "132d342c-14e7-4cf5-a57b-ec168398bcd6" (UID: "132d342c-14e7-4cf5-a57b-ec168398bcd6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.284316 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/132d342c-14e7-4cf5-a57b-ec168398bcd6-kube-api-access-g9xpt" (OuterVolumeSpecName: "kube-api-access-g9xpt") pod "132d342c-14e7-4cf5-a57b-ec168398bcd6" (UID: "132d342c-14e7-4cf5-a57b-ec168398bcd6"). InnerVolumeSpecName "kube-api-access-g9xpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.377046 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9xpt\" (UniqueName: \"kubernetes.io/projected/132d342c-14e7-4cf5-a57b-ec168398bcd6-kube-api-access-g9xpt\") on node \"crc\" DevicePath \"\"" Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.377100 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132d342c-14e7-4cf5-a57b-ec168398bcd6-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.378223 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/132d342c-14e7-4cf5-a57b-ec168398bcd6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "132d342c-14e7-4cf5-a57b-ec168398bcd6" (UID: "132d342c-14e7-4cf5-a57b-ec168398bcd6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.478782 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132d342c-14e7-4cf5-a57b-ec168398bcd6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.513785 4721 generic.go:334] "Generic (PLEG): container finished" podID="132d342c-14e7-4cf5-a57b-ec168398bcd6" containerID="13c7a2c5f7de51349240cfcd6a1e2310a3a001542a18161adbb2b46b9d27bedf" exitCode=0 Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.513826 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6b8c" event={"ID":"132d342c-14e7-4cf5-a57b-ec168398bcd6","Type":"ContainerDied","Data":"13c7a2c5f7de51349240cfcd6a1e2310a3a001542a18161adbb2b46b9d27bedf"} Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.513851 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s6b8c" event={"ID":"132d342c-14e7-4cf5-a57b-ec168398bcd6","Type":"ContainerDied","Data":"9e948eff55381bb46ecc98162f082b8ff09ac0413fd3e4474bf9ea804dd01688"} Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.513866 4721 scope.go:117] "RemoveContainer" containerID="13c7a2c5f7de51349240cfcd6a1e2310a3a001542a18161adbb2b46b9d27bedf" Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.513987 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s6b8c" Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.536434 4721 scope.go:117] "RemoveContainer" containerID="be00503c8dac0091b8b485e8ce6e48268fac83904056717e35c733050f51c4c3" Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.548200 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s6b8c"] Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.557692 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s6b8c"] Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.561438 4721 scope.go:117] "RemoveContainer" containerID="a1e1933fddf132b29cc798adc427aff4954d9bb865810605d5b7733c86a3c42c" Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.606507 4721 scope.go:117] "RemoveContainer" containerID="13c7a2c5f7de51349240cfcd6a1e2310a3a001542a18161adbb2b46b9d27bedf" Feb 02 14:07:28 crc kubenswrapper[4721]: E0202 14:07:28.607011 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13c7a2c5f7de51349240cfcd6a1e2310a3a001542a18161adbb2b46b9d27bedf\": container with ID starting with 13c7a2c5f7de51349240cfcd6a1e2310a3a001542a18161adbb2b46b9d27bedf not found: ID does not exist" containerID="13c7a2c5f7de51349240cfcd6a1e2310a3a001542a18161adbb2b46b9d27bedf" Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.607054 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13c7a2c5f7de51349240cfcd6a1e2310a3a001542a18161adbb2b46b9d27bedf"} err="failed to get container status \"13c7a2c5f7de51349240cfcd6a1e2310a3a001542a18161adbb2b46b9d27bedf\": rpc error: code = NotFound desc = could not find container \"13c7a2c5f7de51349240cfcd6a1e2310a3a001542a18161adbb2b46b9d27bedf\": container with ID starting with 13c7a2c5f7de51349240cfcd6a1e2310a3a001542a18161adbb2b46b9d27bedf not found: ID does not exist" Feb 02 
14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.607099 4721 scope.go:117] "RemoveContainer" containerID="be00503c8dac0091b8b485e8ce6e48268fac83904056717e35c733050f51c4c3" Feb 02 14:07:28 crc kubenswrapper[4721]: E0202 14:07:28.607355 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be00503c8dac0091b8b485e8ce6e48268fac83904056717e35c733050f51c4c3\": container with ID starting with be00503c8dac0091b8b485e8ce6e48268fac83904056717e35c733050f51c4c3 not found: ID does not exist" containerID="be00503c8dac0091b8b485e8ce6e48268fac83904056717e35c733050f51c4c3" Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.607380 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be00503c8dac0091b8b485e8ce6e48268fac83904056717e35c733050f51c4c3"} err="failed to get container status \"be00503c8dac0091b8b485e8ce6e48268fac83904056717e35c733050f51c4c3\": rpc error: code = NotFound desc = could not find container \"be00503c8dac0091b8b485e8ce6e48268fac83904056717e35c733050f51c4c3\": container with ID starting with be00503c8dac0091b8b485e8ce6e48268fac83904056717e35c733050f51c4c3 not found: ID does not exist" Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.607392 4721 scope.go:117] "RemoveContainer" containerID="a1e1933fddf132b29cc798adc427aff4954d9bb865810605d5b7733c86a3c42c" Feb 02 14:07:28 crc kubenswrapper[4721]: E0202 14:07:28.607707 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1e1933fddf132b29cc798adc427aff4954d9bb865810605d5b7733c86a3c42c\": container with ID starting with a1e1933fddf132b29cc798adc427aff4954d9bb865810605d5b7733c86a3c42c not found: ID does not exist" containerID="a1e1933fddf132b29cc798adc427aff4954d9bb865810605d5b7733c86a3c42c" Feb 02 14:07:28 crc kubenswrapper[4721]: I0202 14:07:28.607734 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e1933fddf132b29cc798adc427aff4954d9bb865810605d5b7733c86a3c42c"} err="failed to get container status \"a1e1933fddf132b29cc798adc427aff4954d9bb865810605d5b7733c86a3c42c\": rpc error: code = NotFound desc = could not find container \"a1e1933fddf132b29cc798adc427aff4954d9bb865810605d5b7733c86a3c42c\": container with ID starting with a1e1933fddf132b29cc798adc427aff4954d9bb865810605d5b7733c86a3c42c not found: ID does not exist" Feb 02 14:07:30 crc kubenswrapper[4721]: I0202 14:07:30.422624 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="132d342c-14e7-4cf5-a57b-ec168398bcd6" path="/var/lib/kubelet/pods/132d342c-14e7-4cf5-a57b-ec168398bcd6/volumes" Feb 02 14:07:31 crc kubenswrapper[4721]: I0202 14:07:31.410569 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:07:31 crc kubenswrapper[4721]: E0202 14:07:31.410834 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:07:42 crc kubenswrapper[4721]: I0202 14:07:42.410239 4721 scope.go:117] "RemoveContainer" 
containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:07:42 crc kubenswrapper[4721]: E0202 14:07:42.411278 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:07:56 crc kubenswrapper[4721]: I0202 14:07:56.409765 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:07:56 crc kubenswrapper[4721]: E0202 14:07:56.410501 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.063884 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hqfsp"] Feb 02 14:08:04 crc kubenswrapper[4721]: E0202 14:08:04.065012 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132d342c-14e7-4cf5-a57b-ec168398bcd6" containerName="extract-content" Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.065040 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="132d342c-14e7-4cf5-a57b-ec168398bcd6" containerName="extract-content" Feb 02 14:08:04 crc kubenswrapper[4721]: E0202 14:08:04.065059 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132d342c-14e7-4cf5-a57b-ec168398bcd6" containerName="extract-utilities" Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.065076 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="132d342c-14e7-4cf5-a57b-ec168398bcd6" containerName="extract-utilities" Feb 02 14:08:04 crc kubenswrapper[4721]: E0202 14:08:04.065115 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132d342c-14e7-4cf5-a57b-ec168398bcd6" containerName="registry-server" Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.065122 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="132d342c-14e7-4cf5-a57b-ec168398bcd6" containerName="registry-server" Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.065359 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="132d342c-14e7-4cf5-a57b-ec168398bcd6" containerName="registry-server" Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.067415 4721 util.go:30] "No sandbox for pod can be found. 
Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.090680 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hqfsp"]
Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.139718 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frcgn\" (UniqueName: \"kubernetes.io/projected/41332d77-5523-4863-90fd-84ef4bd024dc-kube-api-access-frcgn\") pod \"redhat-operators-hqfsp\" (UID: \"41332d77-5523-4863-90fd-84ef4bd024dc\") " pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.139937 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41332d77-5523-4863-90fd-84ef4bd024dc-utilities\") pod \"redhat-operators-hqfsp\" (UID: \"41332d77-5523-4863-90fd-84ef4bd024dc\") " pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.140047 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41332d77-5523-4863-90fd-84ef4bd024dc-catalog-content\") pod \"redhat-operators-hqfsp\" (UID: \"41332d77-5523-4863-90fd-84ef4bd024dc\") " pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.242831 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41332d77-5523-4863-90fd-84ef4bd024dc-utilities\") pod \"redhat-operators-hqfsp\" (UID: \"41332d77-5523-4863-90fd-84ef4bd024dc\") " pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.242906 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41332d77-5523-4863-90fd-84ef4bd024dc-catalog-content\") pod \"redhat-operators-hqfsp\" (UID: \"41332d77-5523-4863-90fd-84ef4bd024dc\") " pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.243053 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frcgn\" (UniqueName: \"kubernetes.io/projected/41332d77-5523-4863-90fd-84ef4bd024dc-kube-api-access-frcgn\") pod \"redhat-operators-hqfsp\" (UID: \"41332d77-5523-4863-90fd-84ef4bd024dc\") " pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.243338 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41332d77-5523-4863-90fd-84ef4bd024dc-utilities\") pod \"redhat-operators-hqfsp\" (UID: \"41332d77-5523-4863-90fd-84ef4bd024dc\") " pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.243393 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41332d77-5523-4863-90fd-84ef4bd024dc-catalog-content\") pod \"redhat-operators-hqfsp\" (UID: \"41332d77-5523-4863-90fd-84ef4bd024dc\") " pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.263806 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frcgn\" (UniqueName: \"kubernetes.io/projected/41332d77-5523-4863-90fd-84ef4bd024dc-kube-api-access-frcgn\") pod \"redhat-operators-hqfsp\" (UID: \"41332d77-5523-4863-90fd-84ef4bd024dc\") " pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.399353 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:04 crc kubenswrapper[4721]: I0202 14:08:04.983700 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hqfsp"]
Feb 02 14:08:05 crc kubenswrapper[4721]: I0202 14:08:05.905493 4721 generic.go:334] "Generic (PLEG): container finished" podID="41332d77-5523-4863-90fd-84ef4bd024dc" containerID="d74cc09c503748e4dfd431417e851a6d44e9f0ec667431f7f10124af0c9ae579" exitCode=0
Feb 02 14:08:05 crc kubenswrapper[4721]: I0202 14:08:05.905548 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqfsp" event={"ID":"41332d77-5523-4863-90fd-84ef4bd024dc","Type":"ContainerDied","Data":"d74cc09c503748e4dfd431417e851a6d44e9f0ec667431f7f10124af0c9ae579"}
Feb 02 14:08:05 crc kubenswrapper[4721]: I0202 14:08:05.906104 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqfsp" event={"ID":"41332d77-5523-4863-90fd-84ef4bd024dc","Type":"ContainerStarted","Data":"6c07071d78da47bcb155fe49420a5a01c4cbee1e05c6f701f0e8d4bb5368342f"}
Feb 02 14:08:07 crc kubenswrapper[4721]: I0202 14:08:07.928447 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqfsp" event={"ID":"41332d77-5523-4863-90fd-84ef4bd024dc","Type":"ContainerStarted","Data":"90ef5823e28ad3753a59c90c22174f8e739b207f7258218e3047dbd87eadc733"}
Feb 02 14:08:08 crc kubenswrapper[4721]: I0202 14:08:08.051282 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jnl2q"]
Feb 02 14:08:08 crc kubenswrapper[4721]: I0202 14:08:08.054770 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:08 crc kubenswrapper[4721]: I0202 14:08:08.075409 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jnl2q"]
Feb 02 14:08:08 crc kubenswrapper[4721]: I0202 14:08:08.154611 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e27767b-809c-4392-aec0-a3e3d50959fb-utilities\") pod \"certified-operators-jnl2q\" (UID: \"7e27767b-809c-4392-aec0-a3e3d50959fb\") " pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:08 crc kubenswrapper[4721]: I0202 14:08:08.155054 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cd9w\" (UniqueName: \"kubernetes.io/projected/7e27767b-809c-4392-aec0-a3e3d50959fb-kube-api-access-5cd9w\") pod \"certified-operators-jnl2q\" (UID: \"7e27767b-809c-4392-aec0-a3e3d50959fb\") " pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:08 crc kubenswrapper[4721]: I0202 14:08:08.155283 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e27767b-809c-4392-aec0-a3e3d50959fb-catalog-content\") pod \"certified-operators-jnl2q\" (UID: \"7e27767b-809c-4392-aec0-a3e3d50959fb\") " pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:08 crc kubenswrapper[4721]: I0202 14:08:08.257871 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cd9w\" (UniqueName: \"kubernetes.io/projected/7e27767b-809c-4392-aec0-a3e3d50959fb-kube-api-access-5cd9w\") pod \"certified-operators-jnl2q\" (UID: \"7e27767b-809c-4392-aec0-a3e3d50959fb\") " pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:08 crc kubenswrapper[4721]: I0202 14:08:08.258334 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e27767b-809c-4392-aec0-a3e3d50959fb-catalog-content\") pod \"certified-operators-jnl2q\" (UID: \"7e27767b-809c-4392-aec0-a3e3d50959fb\") " pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:08 crc kubenswrapper[4721]: I0202 14:08:08.258564 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e27767b-809c-4392-aec0-a3e3d50959fb-utilities\") pod \"certified-operators-jnl2q\" (UID: \"7e27767b-809c-4392-aec0-a3e3d50959fb\") " pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:08 crc kubenswrapper[4721]: I0202 14:08:08.258882 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e27767b-809c-4392-aec0-a3e3d50959fb-catalog-content\") pod \"certified-operators-jnl2q\" (UID: \"7e27767b-809c-4392-aec0-a3e3d50959fb\") " pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:08 crc kubenswrapper[4721]: I0202 14:08:08.259063 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e27767b-809c-4392-aec0-a3e3d50959fb-utilities\") pod \"certified-operators-jnl2q\" (UID: \"7e27767b-809c-4392-aec0-a3e3d50959fb\") " pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:08 crc kubenswrapper[4721]: I0202 14:08:08.277323 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cd9w\" (UniqueName: \"kubernetes.io/projected/7e27767b-809c-4392-aec0-a3e3d50959fb-kube-api-access-5cd9w\") pod \"certified-operators-jnl2q\" (UID: \"7e27767b-809c-4392-aec0-a3e3d50959fb\") " pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:08 crc kubenswrapper[4721]: I0202 14:08:08.384541 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:09 crc kubenswrapper[4721]: I0202 14:08:09.063872 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jnl2q"]
Feb 02 14:08:09 crc kubenswrapper[4721]: W0202 14:08:09.475917 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e27767b_809c_4392_aec0_a3e3d50959fb.slice/crio-a80f37e1964f3e1ad813d5e6eeead9f759f231490559155985da437ef23d10c2 WatchSource:0}: Error finding container a80f37e1964f3e1ad813d5e6eeead9f759f231490559155985da437ef23d10c2: Status 404 returned error can't find the container with id a80f37e1964f3e1ad813d5e6eeead9f759f231490559155985da437ef23d10c2
Feb 02 14:08:09 crc kubenswrapper[4721]: I0202 14:08:09.956788 4721 generic.go:334] "Generic (PLEG): container finished" podID="7e27767b-809c-4392-aec0-a3e3d50959fb" containerID="a4bba1e22fb3a442494acc735e394cf429cffe943b51c57d84263aa5f3758e98" exitCode=0
Feb 02 14:08:09 crc kubenswrapper[4721]: I0202 14:08:09.956860 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnl2q" event={"ID":"7e27767b-809c-4392-aec0-a3e3d50959fb","Type":"ContainerDied","Data":"a4bba1e22fb3a442494acc735e394cf429cffe943b51c57d84263aa5f3758e98"}
Feb 02 14:08:09 crc kubenswrapper[4721]: I0202 14:08:09.957099 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnl2q" event={"ID":"7e27767b-809c-4392-aec0-a3e3d50959fb","Type":"ContainerStarted","Data":"a80f37e1964f3e1ad813d5e6eeead9f759f231490559155985da437ef23d10c2"}
Feb 02 14:08:10 crc kubenswrapper[4721]: I0202 14:08:10.970035 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnl2q" event={"ID":"7e27767b-809c-4392-aec0-a3e3d50959fb","Type":"ContainerStarted","Data":"b5da3211ab2c81abf51700d6d4107828beceaba7bc40e9a77c6cea71c7975317"}
Feb 02 14:08:11 crc kubenswrapper[4721]: I0202 14:08:11.409811 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3"
Feb 02 14:08:11 crc kubenswrapper[4721]: E0202 14:08:11.410660 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 14:08:16 crc kubenswrapper[4721]: I0202 14:08:16.027781 4721 generic.go:334] "Generic (PLEG): container finished" podID="41332d77-5523-4863-90fd-84ef4bd024dc" containerID="90ef5823e28ad3753a59c90c22174f8e739b207f7258218e3047dbd87eadc733" exitCode=0
Feb 02 14:08:16 crc kubenswrapper[4721]: I0202 14:08:16.027877 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqfsp" event={"ID":"41332d77-5523-4863-90fd-84ef4bd024dc","Type":"ContainerDied","Data":"90ef5823e28ad3753a59c90c22174f8e739b207f7258218e3047dbd87eadc733"}
Feb 02 14:08:16 crc kubenswrapper[4721]: I0202 14:08:16.031025 4721 generic.go:334] "Generic (PLEG): container finished" podID="7e27767b-809c-4392-aec0-a3e3d50959fb" containerID="b5da3211ab2c81abf51700d6d4107828beceaba7bc40e9a77c6cea71c7975317" exitCode=0
Feb 02 14:08:16 crc kubenswrapper[4721]: I0202 14:08:16.031060 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnl2q" event={"ID":"7e27767b-809c-4392-aec0-a3e3d50959fb","Type":"ContainerDied","Data":"b5da3211ab2c81abf51700d6d4107828beceaba7bc40e9a77c6cea71c7975317"}
Feb 02 14:08:18 crc kubenswrapper[4721]: I0202 14:08:18.056958 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnl2q" event={"ID":"7e27767b-809c-4392-aec0-a3e3d50959fb","Type":"ContainerStarted","Data":"e4a5b4f720a64cea1dfa855a40701a8879af0576f1d55c540e45a5059f322778"}
Feb 02 14:08:18 crc kubenswrapper[4721]: I0202 14:08:18.062530 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqfsp" event={"ID":"41332d77-5523-4863-90fd-84ef4bd024dc","Type":"ContainerStarted","Data":"f942f9a01862da049be32562bbba8ea52e535b48156529cd0268d2469ccc00b8"}
Feb 02 14:08:18 crc kubenswrapper[4721]: I0202 14:08:18.084088 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jnl2q" podStartSLOduration=2.613649991 podStartE2EDuration="10.084048654s" podCreationTimestamp="2026-02-02 14:08:08 +0000 UTC" firstStartedPulling="2026-02-02 14:08:09.958613835 +0000 UTC m=+4030.261128224" lastFinishedPulling="2026-02-02 14:08:17.429012498 +0000 UTC m=+4037.731526887" observedRunningTime="2026-02-02 14:08:18.073635793 +0000 UTC m=+4038.376150202" watchObservedRunningTime="2026-02-02 14:08:18.084048654 +0000 UTC m=+4038.386563043"
Feb 02 14:08:18 crc kubenswrapper[4721]: I0202 14:08:18.102018 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hqfsp" podStartSLOduration=2.489914928 podStartE2EDuration="14.101972148s" podCreationTimestamp="2026-02-02 14:08:04 +0000 UTC" firstStartedPulling="2026-02-02 14:08:05.907942795 +0000 UTC m=+4026.210457204" lastFinishedPulling="2026-02-02 14:08:17.520000025 +0000 UTC m=+4037.822514424" observedRunningTime="2026-02-02 14:08:18.09094558 +0000 UTC m=+4038.393459989" watchObservedRunningTime="2026-02-02 14:08:18.101972148 +0000 UTC m=+4038.404486537"
Feb 02 14:08:18 crc kubenswrapper[4721]: I0202 14:08:18.385421 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:18 crc kubenswrapper[4721]: I0202 14:08:18.385877 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:19 crc kubenswrapper[4721]: I0202 14:08:19.449315 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-jnl2q" podUID="7e27767b-809c-4392-aec0-a3e3d50959fb" containerName="registry-server" probeResult="failure" output=<
Feb 02 14:08:19 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s
Feb 02 14:08:19 crc kubenswrapper[4721]: >
Feb 02 14:08:23 crc kubenswrapper[4721]: I0202 14:08:23.410331 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3"
Feb 02 14:08:23 crc kubenswrapper[4721]: E0202 14:08:23.411483 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 14:08:24 crc kubenswrapper[4721]: I0202 14:08:24.400376 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:24 crc kubenswrapper[4721]: I0202 14:08:24.400429 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:25 crc kubenswrapper[4721]: I0202 14:08:25.447591 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hqfsp" podUID="41332d77-5523-4863-90fd-84ef4bd024dc" containerName="registry-server" probeResult="failure" output=<
Feb 02 14:08:25 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s
Feb 02 14:08:25 crc kubenswrapper[4721]: >
Feb 02 14:08:28 crc kubenswrapper[4721]: I0202 14:08:28.451162 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:28 crc kubenswrapper[4721]: I0202 14:08:28.514016 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:28 crc kubenswrapper[4721]: I0202 14:08:28.713096 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jnl2q"]
Feb 02 14:08:30 crc kubenswrapper[4721]: I0202 14:08:30.177093 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jnl2q" podUID="7e27767b-809c-4392-aec0-a3e3d50959fb" containerName="registry-server" containerID="cri-o://e4a5b4f720a64cea1dfa855a40701a8879af0576f1d55c540e45a5059f322778" gracePeriod=2
Feb 02 14:08:31 crc kubenswrapper[4721]: I0202 14:08:31.191000 4721 generic.go:334] "Generic (PLEG): container finished" podID="7e27767b-809c-4392-aec0-a3e3d50959fb" containerID="e4a5b4f720a64cea1dfa855a40701a8879af0576f1d55c540e45a5059f322778" exitCode=0
Feb 02 14:08:31 crc kubenswrapper[4721]: I0202 14:08:31.191044 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnl2q" event={"ID":"7e27767b-809c-4392-aec0-a3e3d50959fb","Type":"ContainerDied","Data":"e4a5b4f720a64cea1dfa855a40701a8879af0576f1d55c540e45a5059f322778"}
Feb 02 14:08:31 crc kubenswrapper[4721]: I0202 14:08:31.605213 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:31 crc kubenswrapper[4721]: I0202 14:08:31.654899 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e27767b-809c-4392-aec0-a3e3d50959fb-utilities\") pod \"7e27767b-809c-4392-aec0-a3e3d50959fb\" (UID: \"7e27767b-809c-4392-aec0-a3e3d50959fb\") "
Feb 02 14:08:31 crc kubenswrapper[4721]: I0202 14:08:31.655670 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cd9w\" (UniqueName: \"kubernetes.io/projected/7e27767b-809c-4392-aec0-a3e3d50959fb-kube-api-access-5cd9w\") pod \"7e27767b-809c-4392-aec0-a3e3d50959fb\" (UID: \"7e27767b-809c-4392-aec0-a3e3d50959fb\") "
Feb 02 14:08:31 crc kubenswrapper[4721]: I0202 14:08:31.655778 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e27767b-809c-4392-aec0-a3e3d50959fb-catalog-content\") pod \"7e27767b-809c-4392-aec0-a3e3d50959fb\" (UID: \"7e27767b-809c-4392-aec0-a3e3d50959fb\") "
Feb 02 14:08:31 crc kubenswrapper[4721]: I0202 14:08:31.655787 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e27767b-809c-4392-aec0-a3e3d50959fb-utilities" (OuterVolumeSpecName: "utilities") pod "7e27767b-809c-4392-aec0-a3e3d50959fb" (UID: "7e27767b-809c-4392-aec0-a3e3d50959fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 14:08:31 crc kubenswrapper[4721]: I0202 14:08:31.656482 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e27767b-809c-4392-aec0-a3e3d50959fb-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 14:08:31 crc kubenswrapper[4721]: I0202 14:08:31.661140 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e27767b-809c-4392-aec0-a3e3d50959fb-kube-api-access-5cd9w" (OuterVolumeSpecName: "kube-api-access-5cd9w") pod "7e27767b-809c-4392-aec0-a3e3d50959fb" (UID: "7e27767b-809c-4392-aec0-a3e3d50959fb"). InnerVolumeSpecName "kube-api-access-5cd9w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 14:08:31 crc kubenswrapper[4721]: I0202 14:08:31.711123 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e27767b-809c-4392-aec0-a3e3d50959fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e27767b-809c-4392-aec0-a3e3d50959fb" (UID: "7e27767b-809c-4392-aec0-a3e3d50959fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 14:08:31 crc kubenswrapper[4721]: I0202 14:08:31.760249 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cd9w\" (UniqueName: \"kubernetes.io/projected/7e27767b-809c-4392-aec0-a3e3d50959fb-kube-api-access-5cd9w\") on node \"crc\" DevicePath \"\""
Feb 02 14:08:31 crc kubenswrapper[4721]: I0202 14:08:31.760302 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e27767b-809c-4392-aec0-a3e3d50959fb-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 14:08:32 crc kubenswrapper[4721]: I0202 14:08:32.204807 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jnl2q" event={"ID":"7e27767b-809c-4392-aec0-a3e3d50959fb","Type":"ContainerDied","Data":"a80f37e1964f3e1ad813d5e6eeead9f759f231490559155985da437ef23d10c2"}
Feb 02 14:08:32 crc kubenswrapper[4721]: I0202 14:08:32.204866 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jnl2q"
Feb 02 14:08:32 crc kubenswrapper[4721]: I0202 14:08:32.204881 4721 scope.go:117] "RemoveContainer" containerID="e4a5b4f720a64cea1dfa855a40701a8879af0576f1d55c540e45a5059f322778"
Feb 02 14:08:32 crc kubenswrapper[4721]: I0202 14:08:32.251529 4721 scope.go:117] "RemoveContainer" containerID="b5da3211ab2c81abf51700d6d4107828beceaba7bc40e9a77c6cea71c7975317"
Feb 02 14:08:32 crc kubenswrapper[4721]: I0202 14:08:32.253024 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jnl2q"]
Feb 02 14:08:32 crc kubenswrapper[4721]: I0202 14:08:32.267528 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jnl2q"]
Feb 02 14:08:32 crc kubenswrapper[4721]: I0202 14:08:32.322820 4721 scope.go:117] "RemoveContainer" containerID="a4bba1e22fb3a442494acc735e394cf429cffe943b51c57d84263aa5f3758e98"
Feb 02 14:08:32 crc kubenswrapper[4721]: I0202 14:08:32.422265 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e27767b-809c-4392-aec0-a3e3d50959fb" path="/var/lib/kubelet/pods/7e27767b-809c-4392-aec0-a3e3d50959fb/volumes"
Feb 02 14:08:35 crc kubenswrapper[4721]: I0202 14:08:35.456115 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hqfsp" podUID="41332d77-5523-4863-90fd-84ef4bd024dc" containerName="registry-server" probeResult="failure" output=<
Feb 02 14:08:35 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s
Feb 02 14:08:35 crc kubenswrapper[4721]: >
Feb 02 14:08:36 crc kubenswrapper[4721]: I0202 14:08:36.409989 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3"
Feb 02 14:08:36 crc kubenswrapper[4721]: E0202 14:08:36.410595 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 14:08:45 crc kubenswrapper[4721]: I0202 14:08:45.901638 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hqfsp" podUID="41332d77-5523-4863-90fd-84ef4bd024dc" containerName="registry-server" probeResult="failure" output=<
Feb 02 14:08:45 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s
Feb 02 14:08:45 crc kubenswrapper[4721]: >
Feb 02 14:08:47 crc kubenswrapper[4721]: I0202 14:08:47.410337 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3"
Feb 02 14:08:47 crc kubenswrapper[4721]: E0202 14:08:47.410967 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 14:08:54 crc kubenswrapper[4721]: I0202 14:08:54.451705 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:54 crc kubenswrapper[4721]: I0202 14:08:54.519166 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:54 crc kubenswrapper[4721]: I0202 14:08:54.693434 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hqfsp"]
Feb 02 14:08:56 crc kubenswrapper[4721]: I0202 14:08:56.442162 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hqfsp" podUID="41332d77-5523-4863-90fd-84ef4bd024dc" containerName="registry-server" containerID="cri-o://f942f9a01862da049be32562bbba8ea52e535b48156529cd0268d2469ccc00b8" gracePeriod=2
Feb 02 14:08:56 crc kubenswrapper[4721]: I0202 14:08:56.947862 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.003027 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frcgn\" (UniqueName: \"kubernetes.io/projected/41332d77-5523-4863-90fd-84ef4bd024dc-kube-api-access-frcgn\") pod \"41332d77-5523-4863-90fd-84ef4bd024dc\" (UID: \"41332d77-5523-4863-90fd-84ef4bd024dc\") "
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.003404 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41332d77-5523-4863-90fd-84ef4bd024dc-catalog-content\") pod \"41332d77-5523-4863-90fd-84ef4bd024dc\" (UID: \"41332d77-5523-4863-90fd-84ef4bd024dc\") "
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.003670 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41332d77-5523-4863-90fd-84ef4bd024dc-utilities\") pod \"41332d77-5523-4863-90fd-84ef4bd024dc\" (UID: \"41332d77-5523-4863-90fd-84ef4bd024dc\") "
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.004589 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41332d77-5523-4863-90fd-84ef4bd024dc-utilities" (OuterVolumeSpecName: "utilities") pod "41332d77-5523-4863-90fd-84ef4bd024dc" (UID: "41332d77-5523-4863-90fd-84ef4bd024dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.004909 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41332d77-5523-4863-90fd-84ef4bd024dc-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.012271 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41332d77-5523-4863-90fd-84ef4bd024dc-kube-api-access-frcgn" (OuterVolumeSpecName: "kube-api-access-frcgn") pod "41332d77-5523-4863-90fd-84ef4bd024dc" (UID: "41332d77-5523-4863-90fd-84ef4bd024dc"). InnerVolumeSpecName "kube-api-access-frcgn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.107187 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frcgn\" (UniqueName: \"kubernetes.io/projected/41332d77-5523-4863-90fd-84ef4bd024dc-kube-api-access-frcgn\") on node \"crc\" DevicePath \"\""
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.133481 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41332d77-5523-4863-90fd-84ef4bd024dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41332d77-5523-4863-90fd-84ef4bd024dc" (UID: "41332d77-5523-4863-90fd-84ef4bd024dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.209861 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41332d77-5523-4863-90fd-84ef4bd024dc-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.454286 4721 generic.go:334] "Generic (PLEG): container finished" podID="41332d77-5523-4863-90fd-84ef4bd024dc" containerID="f942f9a01862da049be32562bbba8ea52e535b48156529cd0268d2469ccc00b8" exitCode=0
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.454331 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqfsp" event={"ID":"41332d77-5523-4863-90fd-84ef4bd024dc","Type":"ContainerDied","Data":"f942f9a01862da049be32562bbba8ea52e535b48156529cd0268d2469ccc00b8"}
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.454369 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqfsp" event={"ID":"41332d77-5523-4863-90fd-84ef4bd024dc","Type":"ContainerDied","Data":"6c07071d78da47bcb155fe49420a5a01c4cbee1e05c6f701f0e8d4bb5368342f"}
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.454374 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqfsp"
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.454392 4721 scope.go:117] "RemoveContainer" containerID="f942f9a01862da049be32562bbba8ea52e535b48156529cd0268d2469ccc00b8"
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.481181 4721 scope.go:117] "RemoveContainer" containerID="90ef5823e28ad3753a59c90c22174f8e739b207f7258218e3047dbd87eadc733"
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.502617 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hqfsp"]
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.530576 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hqfsp"]
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.542338 4721 scope.go:117] "RemoveContainer" containerID="d74cc09c503748e4dfd431417e851a6d44e9f0ec667431f7f10124af0c9ae579"
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.582811 4721 scope.go:117] "RemoveContainer" containerID="f942f9a01862da049be32562bbba8ea52e535b48156529cd0268d2469ccc00b8"
Feb 02 14:08:57 crc kubenswrapper[4721]: E0202 14:08:57.583288 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f942f9a01862da049be32562bbba8ea52e535b48156529cd0268d2469ccc00b8\": container with ID starting with f942f9a01862da049be32562bbba8ea52e535b48156529cd0268d2469ccc00b8 not found: ID does not exist" containerID="f942f9a01862da049be32562bbba8ea52e535b48156529cd0268d2469ccc00b8"
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.583325 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f942f9a01862da049be32562bbba8ea52e535b48156529cd0268d2469ccc00b8"} err="failed to get container status \"f942f9a01862da049be32562bbba8ea52e535b48156529cd0268d2469ccc00b8\": rpc error: code = NotFound desc = could not find container \"f942f9a01862da049be32562bbba8ea52e535b48156529cd0268d2469ccc00b8\": container with ID starting with f942f9a01862da049be32562bbba8ea52e535b48156529cd0268d2469ccc00b8 not found: ID does not exist"
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.583348 4721 scope.go:117] "RemoveContainer" containerID="90ef5823e28ad3753a59c90c22174f8e739b207f7258218e3047dbd87eadc733"
Feb 02 14:08:57 crc kubenswrapper[4721]: E0202 14:08:57.583806 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90ef5823e28ad3753a59c90c22174f8e739b207f7258218e3047dbd87eadc733\": container with ID starting with 90ef5823e28ad3753a59c90c22174f8e739b207f7258218e3047dbd87eadc733 not found: ID does not exist" containerID="90ef5823e28ad3753a59c90c22174f8e739b207f7258218e3047dbd87eadc733"
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.583854 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ef5823e28ad3753a59c90c22174f8e739b207f7258218e3047dbd87eadc733"} err="failed to get container status \"90ef5823e28ad3753a59c90c22174f8e739b207f7258218e3047dbd87eadc733\": rpc error: code = NotFound desc = could not find container \"90ef5823e28ad3753a59c90c22174f8e739b207f7258218e3047dbd87eadc733\": container with ID starting with 90ef5823e28ad3753a59c90c22174f8e739b207f7258218e3047dbd87eadc733 not found: ID does not exist"
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.583887 4721 scope.go:117] "RemoveContainer" containerID="d74cc09c503748e4dfd431417e851a6d44e9f0ec667431f7f10124af0c9ae579"
Feb 02 14:08:57 crc kubenswrapper[4721]: E0202 14:08:57.584312 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d74cc09c503748e4dfd431417e851a6d44e9f0ec667431f7f10124af0c9ae579\": container with ID starting with d74cc09c503748e4dfd431417e851a6d44e9f0ec667431f7f10124af0c9ae579 not found: ID does not exist" containerID="d74cc09c503748e4dfd431417e851a6d44e9f0ec667431f7f10124af0c9ae579"
Feb 02 14:08:57 crc kubenswrapper[4721]: I0202 14:08:57.584347 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74cc09c503748e4dfd431417e851a6d44e9f0ec667431f7f10124af0c9ae579"} err="failed to get container status \"d74cc09c503748e4dfd431417e851a6d44e9f0ec667431f7f10124af0c9ae579\": rpc error: code = NotFound desc = could not find container \"d74cc09c503748e4dfd431417e851a6d44e9f0ec667431f7f10124af0c9ae579\": container with ID starting with d74cc09c503748e4dfd431417e851a6d44e9f0ec667431f7f10124af0c9ae579 not found: ID does not exist"
Feb 02 14:08:58 crc kubenswrapper[4721]: I0202 14:08:58.430866 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41332d77-5523-4863-90fd-84ef4bd024dc" path="/var/lib/kubelet/pods/41332d77-5523-4863-90fd-84ef4bd024dc/volumes"
Feb 02 14:09:00 crc kubenswrapper[4721]: I0202 14:09:00.416472 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3"
Feb 02 14:09:00 crc kubenswrapper[4721]: E0202 14:09:00.417100 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 14:09:14 crc kubenswrapper[4721]: I0202 14:09:14.411157 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3"
Feb 02 14:09:14 crc kubenswrapper[4721]: E0202 14:09:14.413538 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.116714 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6jl2c"]
Feb 02 14:09:18 crc kubenswrapper[4721]: E0202 14:09:18.118190 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e27767b-809c-4392-aec0-a3e3d50959fb" containerName="extract-content"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.118216 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e27767b-809c-4392-aec0-a3e3d50959fb" containerName="extract-content"
Feb 02 14:09:18 crc kubenswrapper[4721]: E0202 14:09:18.118261 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41332d77-5523-4863-90fd-84ef4bd024dc" containerName="extract-utilities"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.118276 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="41332d77-5523-4863-90fd-84ef4bd024dc" containerName="extract-utilities"
Feb 02 14:09:18 crc kubenswrapper[4721]: E0202 14:09:18.118298 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e27767b-809c-4392-aec0-a3e3d50959fb" containerName="registry-server"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.118311 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e27767b-809c-4392-aec0-a3e3d50959fb" containerName="registry-server"
Feb 02 14:09:18 crc kubenswrapper[4721]: E0202 14:09:18.118334 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41332d77-5523-4863-90fd-84ef4bd024dc" containerName="extract-content"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.118345 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="41332d77-5523-4863-90fd-84ef4bd024dc" containerName="extract-content"
Feb 02 14:09:18 crc kubenswrapper[4721]: E0202 14:09:18.118369 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e27767b-809c-4392-aec0-a3e3d50959fb" containerName="extract-utilities"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.118380 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e27767b-809c-4392-aec0-a3e3d50959fb" containerName="extract-utilities"
Feb 02 14:09:18 crc kubenswrapper[4721]: E0202 14:09:18.118406 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41332d77-5523-4863-90fd-84ef4bd024dc" containerName="registry-server"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.118419 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="41332d77-5523-4863-90fd-84ef4bd024dc" containerName="registry-server"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.118801 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="41332d77-5523-4863-90fd-84ef4bd024dc" containerName="registry-server"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.118830 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e27767b-809c-4392-aec0-a3e3d50959fb" containerName="registry-server"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.121644 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.125350 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124e4063-2c55-4307-9d6f-8e9f776a994f-catalog-content\") pod \"redhat-marketplace-6jl2c\" (UID: \"124e4063-2c55-4307-9d6f-8e9f776a994f\") " pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.125603 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xkxx\" (UniqueName: \"kubernetes.io/projected/124e4063-2c55-4307-9d6f-8e9f776a994f-kube-api-access-2xkxx\") pod \"redhat-marketplace-6jl2c\" (UID: \"124e4063-2c55-4307-9d6f-8e9f776a994f\") " pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.125741 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124e4063-2c55-4307-9d6f-8e9f776a994f-utilities\") pod \"redhat-marketplace-6jl2c\" (UID: \"124e4063-2c55-4307-9d6f-8e9f776a994f\") " pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.126192 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jl2c"]
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.227510 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124e4063-2c55-4307-9d6f-8e9f776a994f-catalog-content\") pod \"redhat-marketplace-6jl2c\" (UID: \"124e4063-2c55-4307-9d6f-8e9f776a994f\") " pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.227648 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xkxx\" (UniqueName: \"kubernetes.io/projected/124e4063-2c55-4307-9d6f-8e9f776a994f-kube-api-access-2xkxx\") pod \"redhat-marketplace-6jl2c\" (UID: \"124e4063-2c55-4307-9d6f-8e9f776a994f\") " pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.227718 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124e4063-2c55-4307-9d6f-8e9f776a994f-utilities\") pod \"redhat-marketplace-6jl2c\" (UID: \"124e4063-2c55-4307-9d6f-8e9f776a994f\") " pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.228119 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124e4063-2c55-4307-9d6f-8e9f776a994f-catalog-content\") pod \"redhat-marketplace-6jl2c\" (UID: \"124e4063-2c55-4307-9d6f-8e9f776a994f\") " pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.228249 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124e4063-2c55-4307-9d6f-8e9f776a994f-utilities\") pod \"redhat-marketplace-6jl2c\" (UID: \"124e4063-2c55-4307-9d6f-8e9f776a994f\") " pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.265452 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xkxx\" (UniqueName: \"kubernetes.io/projected/124e4063-2c55-4307-9d6f-8e9f776a994f-kube-api-access-2xkxx\") pod \"redhat-marketplace-6jl2c\" (UID: \"124e4063-2c55-4307-9d6f-8e9f776a994f\") " pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.455117 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:18 crc kubenswrapper[4721]: I0202 14:09:18.988872 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jl2c"]
Feb 02 14:09:19 crc kubenswrapper[4721]: I0202 14:09:19.681254 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jl2c" event={"ID":"124e4063-2c55-4307-9d6f-8e9f776a994f","Type":"ContainerStarted","Data":"46db6615e27a562cfbe68ae9f7728fbd5e8ecdf926c7ecc60b506a4cfc3648cf"}
Feb 02 14:09:20 crc kubenswrapper[4721]: I0202 14:09:20.691734 4721 generic.go:334] "Generic (PLEG): container finished" podID="124e4063-2c55-4307-9d6f-8e9f776a994f" containerID="5610d8b090cbb2b0bfe52ec49beb490a7d7ac634ccbe498d94376068a10d066e" exitCode=0
Feb 02 14:09:20 crc kubenswrapper[4721]: I0202 14:09:20.692256 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jl2c" event={"ID":"124e4063-2c55-4307-9d6f-8e9f776a994f","Type":"ContainerDied","Data":"5610d8b090cbb2b0bfe52ec49beb490a7d7ac634ccbe498d94376068a10d066e"}
Feb 02 14:09:21 crc kubenswrapper[4721]: I0202 14:09:21.705856 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jl2c" event={"ID":"124e4063-2c55-4307-9d6f-8e9f776a994f","Type":"ContainerStarted","Data":"130a4ac91634cb769f646163be7809f560730b1d14eca045d2fff194f2e1d37d"}
Feb 02 14:09:22 crc kubenswrapper[4721]: I0202 14:09:22.726512 4721 generic.go:334] "Generic (PLEG): container finished" podID="124e4063-2c55-4307-9d6f-8e9f776a994f" containerID="130a4ac91634cb769f646163be7809f560730b1d14eca045d2fff194f2e1d37d" exitCode=0
Feb 02 14:09:22 crc kubenswrapper[4721]: I0202 14:09:22.726588 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jl2c" event={"ID":"124e4063-2c55-4307-9d6f-8e9f776a994f","Type":"ContainerDied","Data":"130a4ac91634cb769f646163be7809f560730b1d14eca045d2fff194f2e1d37d"}
Feb 02 14:09:23 crc kubenswrapper[4721]: I0202 14:09:23.739292 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jl2c" event={"ID":"124e4063-2c55-4307-9d6f-8e9f776a994f","Type":"ContainerStarted","Data":"c44eb14cba4af2787a2a4a196d4a91f350001847fe8980913c05863545f8fe41"}
Feb 02 14:09:23 crc kubenswrapper[4721]: I0202 14:09:23.783888 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6jl2c" podStartSLOduration=3.375133914 podStartE2EDuration="5.783864062s" podCreationTimestamp="2026-02-02 14:09:18 +0000 UTC" firstStartedPulling="2026-02-02 14:09:20.694568529 +0000 UTC m=+4100.997082918" lastFinishedPulling="2026-02-02 14:09:23.103298687 +0000 UTC m=+4103.405813066" observedRunningTime="2026-02-02 14:09:23.755540647 +0000 UTC m=+4104.058055036" watchObservedRunningTime="2026-02-02 14:09:23.783864062 +0000 UTC m=+4104.086378441"
Feb 02 14:09:28 crc kubenswrapper[4721]: I0202 14:09:28.455775 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:28 crc kubenswrapper[4721]: I0202 14:09:28.456175 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:28 crc kubenswrapper[4721]: I0202 14:09:28.513775 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:28 crc kubenswrapper[4721]: I0202 14:09:28.851012 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:28 crc kubenswrapper[4721]: I0202 14:09:28.909131 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jl2c"]
Feb 02 14:09:29 crc kubenswrapper[4721]: I0202 14:09:29.409664 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3"
Feb 02 14:09:29 crc kubenswrapper[4721]: E0202 14:09:29.410380 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 14:09:30 crc kubenswrapper[4721]: I0202 14:09:30.819021 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6jl2c" podUID="124e4063-2c55-4307-9d6f-8e9f776a994f" containerName="registry-server" containerID="cri-o://c44eb14cba4af2787a2a4a196d4a91f350001847fe8980913c05863545f8fe41" gracePeriod=2
Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.396633 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jl2c"
Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.557218 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124e4063-2c55-4307-9d6f-8e9f776a994f-catalog-content\") pod \"124e4063-2c55-4307-9d6f-8e9f776a994f\" (UID: \"124e4063-2c55-4307-9d6f-8e9f776a994f\") "
Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.557559 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124e4063-2c55-4307-9d6f-8e9f776a994f-utilities\") pod \"124e4063-2c55-4307-9d6f-8e9f776a994f\" (UID: \"124e4063-2c55-4307-9d6f-8e9f776a994f\") "
Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.557622 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xkxx\" (UniqueName: \"kubernetes.io/projected/124e4063-2c55-4307-9d6f-8e9f776a994f-kube-api-access-2xkxx\") pod \"124e4063-2c55-4307-9d6f-8e9f776a994f\" (UID: \"124e4063-2c55-4307-9d6f-8e9f776a994f\") "
Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.558906 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/124e4063-2c55-4307-9d6f-8e9f776a994f-utilities" (OuterVolumeSpecName: "utilities") pod "124e4063-2c55-4307-9d6f-8e9f776a994f" (UID: "124e4063-2c55-4307-9d6f-8e9f776a994f"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.575375 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/124e4063-2c55-4307-9d6f-8e9f776a994f-kube-api-access-2xkxx" (OuterVolumeSpecName: "kube-api-access-2xkxx") pod "124e4063-2c55-4307-9d6f-8e9f776a994f" (UID: "124e4063-2c55-4307-9d6f-8e9f776a994f"). InnerVolumeSpecName "kube-api-access-2xkxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.594216 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/124e4063-2c55-4307-9d6f-8e9f776a994f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "124e4063-2c55-4307-9d6f-8e9f776a994f" (UID: "124e4063-2c55-4307-9d6f-8e9f776a994f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.659604 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/124e4063-2c55-4307-9d6f-8e9f776a994f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.659635 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/124e4063-2c55-4307-9d6f-8e9f776a994f-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.659645 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xkxx\" (UniqueName: \"kubernetes.io/projected/124e4063-2c55-4307-9d6f-8e9f776a994f-kube-api-access-2xkxx\") on node \"crc\" DevicePath \"\"" Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.831602 4721 generic.go:334] "Generic (PLEG): container finished" podID="124e4063-2c55-4307-9d6f-8e9f776a994f" containerID="c44eb14cba4af2787a2a4a196d4a91f350001847fe8980913c05863545f8fe41" exitCode=0 Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.831674 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jl2c" Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.832161 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jl2c" event={"ID":"124e4063-2c55-4307-9d6f-8e9f776a994f","Type":"ContainerDied","Data":"c44eb14cba4af2787a2a4a196d4a91f350001847fe8980913c05863545f8fe41"} Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.835098 4721 scope.go:117] "RemoveContainer" containerID="c44eb14cba4af2787a2a4a196d4a91f350001847fe8980913c05863545f8fe41" Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.835018 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jl2c" event={"ID":"124e4063-2c55-4307-9d6f-8e9f776a994f","Type":"ContainerDied","Data":"46db6615e27a562cfbe68ae9f7728fbd5e8ecdf926c7ecc60b506a4cfc3648cf"} Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.858932 4721 scope.go:117] "RemoveContainer" containerID="130a4ac91634cb769f646163be7809f560730b1d14eca045d2fff194f2e1d37d" Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.881108 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jl2c"] Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.893362 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jl2c"] Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.901220 4721 scope.go:117] "RemoveContainer" containerID="5610d8b090cbb2b0bfe52ec49beb490a7d7ac634ccbe498d94376068a10d066e" Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.946838 4721 scope.go:117] "RemoveContainer" containerID="c44eb14cba4af2787a2a4a196d4a91f350001847fe8980913c05863545f8fe41" Feb 02 14:09:31 crc kubenswrapper[4721]: E0202 14:09:31.947526 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c44eb14cba4af2787a2a4a196d4a91f350001847fe8980913c05863545f8fe41\": container with ID starting with c44eb14cba4af2787a2a4a196d4a91f350001847fe8980913c05863545f8fe41 not found: ID does not exist" containerID="c44eb14cba4af2787a2a4a196d4a91f350001847fe8980913c05863545f8fe41" Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.947565 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c44eb14cba4af2787a2a4a196d4a91f350001847fe8980913c05863545f8fe41"} err="failed to get container status \"c44eb14cba4af2787a2a4a196d4a91f350001847fe8980913c05863545f8fe41\": rpc error: code = NotFound desc = could not find container \"c44eb14cba4af2787a2a4a196d4a91f350001847fe8980913c05863545f8fe41\": container with ID starting with c44eb14cba4af2787a2a4a196d4a91f350001847fe8980913c05863545f8fe41 not found: ID does not exist" Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.947591 4721 scope.go:117] "RemoveContainer" containerID="130a4ac91634cb769f646163be7809f560730b1d14eca045d2fff194f2e1d37d" Feb 02 14:09:31 crc kubenswrapper[4721]: E0202 14:09:31.948014 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"130a4ac91634cb769f646163be7809f560730b1d14eca045d2fff194f2e1d37d\": container with ID starting with 130a4ac91634cb769f646163be7809f560730b1d14eca045d2fff194f2e1d37d not found: ID does not exist" containerID="130a4ac91634cb769f646163be7809f560730b1d14eca045d2fff194f2e1d37d" Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.948151 4721 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"130a4ac91634cb769f646163be7809f560730b1d14eca045d2fff194f2e1d37d"} err="failed to get container status \"130a4ac91634cb769f646163be7809f560730b1d14eca045d2fff194f2e1d37d\": rpc error: code = NotFound desc = could not find container \"130a4ac91634cb769f646163be7809f560730b1d14eca045d2fff194f2e1d37d\": container with ID starting with 130a4ac91634cb769f646163be7809f560730b1d14eca045d2fff194f2e1d37d not found: ID does not exist" Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.948248 4721 scope.go:117] "RemoveContainer" containerID="5610d8b090cbb2b0bfe52ec49beb490a7d7ac634ccbe498d94376068a10d066e" Feb 02 14:09:31 crc kubenswrapper[4721]: E0202 14:09:31.948647 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5610d8b090cbb2b0bfe52ec49beb490a7d7ac634ccbe498d94376068a10d066e\": container with ID starting with 5610d8b090cbb2b0bfe52ec49beb490a7d7ac634ccbe498d94376068a10d066e not found: ID does not exist" containerID="5610d8b090cbb2b0bfe52ec49beb490a7d7ac634ccbe498d94376068a10d066e" Feb 02 14:09:31 crc kubenswrapper[4721]: I0202 14:09:31.948702 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5610d8b090cbb2b0bfe52ec49beb490a7d7ac634ccbe498d94376068a10d066e"} err="failed to get container status \"5610d8b090cbb2b0bfe52ec49beb490a7d7ac634ccbe498d94376068a10d066e\": rpc error: code = NotFound desc = could not find container \"5610d8b090cbb2b0bfe52ec49beb490a7d7ac634ccbe498d94376068a10d066e\": container with ID starting with 5610d8b090cbb2b0bfe52ec49beb490a7d7ac634ccbe498d94376068a10d066e not found: ID does not exist" Feb 02 14:09:32 crc kubenswrapper[4721]: I0202 14:09:32.442972 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="124e4063-2c55-4307-9d6f-8e9f776a994f" path="/var/lib/kubelet/pods/124e4063-2c55-4307-9d6f-8e9f776a994f/volumes" Feb 02 14:09:44 crc kubenswrapper[4721]: I0202 14:09:44.410333 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:09:44 crc kubenswrapper[4721]: E0202 14:09:44.411649 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:09:59 crc kubenswrapper[4721]: I0202 14:09:59.410662 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:09:59 crc kubenswrapper[4721]: E0202 14:09:59.411744 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:10:12 crc kubenswrapper[4721]: I0202 14:10:12.411664 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:10:12 crc 
kubenswrapper[4721]: E0202 14:10:12.414503 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:10:25 crc kubenswrapper[4721]: I0202 14:10:25.410977 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:10:26 crc kubenswrapper[4721]: I0202 14:10:26.425095 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"814c7c146f4891242cef124426d3e01f7b0766cdf2128bc2a21fd29b8f9fdeb0"} Feb 02 14:12:44 crc kubenswrapper[4721]: I0202 14:12:44.763549 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:12:44 crc kubenswrapper[4721]: I0202 14:12:44.764184 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:13:14 crc kubenswrapper[4721]: I0202 14:13:14.763646 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:13:14 crc kubenswrapper[4721]: I0202 14:13:14.764283 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:13:44 crc kubenswrapper[4721]: I0202 14:13:44.763343 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:13:44 crc kubenswrapper[4721]: I0202 14:13:44.763947 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:13:44 crc kubenswrapper[4721]: I0202 14:13:44.763999 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 14:13:44 crc kubenswrapper[4721]: I0202 14:13:44.764999 4721 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"814c7c146f4891242cef124426d3e01f7b0766cdf2128bc2a21fd29b8f9fdeb0"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 14:13:44 crc kubenswrapper[4721]: I0202 14:13:44.765067 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://814c7c146f4891242cef124426d3e01f7b0766cdf2128bc2a21fd29b8f9fdeb0" gracePeriod=600 Feb 02 14:13:45 crc kubenswrapper[4721]: I0202 14:13:45.409958 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="814c7c146f4891242cef124426d3e01f7b0766cdf2128bc2a21fd29b8f9fdeb0" exitCode=0 Feb 02 14:13:45 crc kubenswrapper[4721]: I0202 14:13:45.409992 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"814c7c146f4891242cef124426d3e01f7b0766cdf2128bc2a21fd29b8f9fdeb0"} Feb 02 14:13:45 crc kubenswrapper[4721]: I0202 14:13:45.410562 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd"} Feb 02 14:13:45 crc kubenswrapper[4721]: I0202 14:13:45.410587 4721 scope.go:117] "RemoveContainer" containerID="5cb4b1f20ddb60a671585bed29dfb5cdee5b7a5164771c89413e46dd4b8502d3" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.372635 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5"] Feb 02 14:15:00 crc kubenswrapper[4721]: E0202 14:15:00.373588 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124e4063-2c55-4307-9d6f-8e9f776a994f" containerName="extract-utilities" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.373609 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="124e4063-2c55-4307-9d6f-8e9f776a994f" containerName="extract-utilities" Feb 02 14:15:00 crc kubenswrapper[4721]: E0202 14:15:00.373664 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124e4063-2c55-4307-9d6f-8e9f776a994f" containerName="extract-content" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.373672 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="124e4063-2c55-4307-9d6f-8e9f776a994f" containerName="extract-content" Feb 02 14:15:00 crc kubenswrapper[4721]: E0202 14:15:00.373691 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="124e4063-2c55-4307-9d6f-8e9f776a994f" containerName="registry-server" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.373700 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="124e4063-2c55-4307-9d6f-8e9f776a994f" containerName="registry-server" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.373958 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="124e4063-2c55-4307-9d6f-8e9f776a994f" containerName="registry-server" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.374788 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.377003 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.383207 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5"] Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.383265 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.485666 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea67d838-a947-4428-a7a0-f1a8484eadf1-config-volume\") pod \"collect-profiles-29500695-2dpd5\" (UID: \"ea67d838-a947-4428-a7a0-f1a8484eadf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.486417 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rq6h\" (UniqueName: \"kubernetes.io/projected/ea67d838-a947-4428-a7a0-f1a8484eadf1-kube-api-access-7rq6h\") pod \"collect-profiles-29500695-2dpd5\" (UID: \"ea67d838-a947-4428-a7a0-f1a8484eadf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.487319 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea67d838-a947-4428-a7a0-f1a8484eadf1-secret-volume\") pod \"collect-profiles-29500695-2dpd5\" (UID: \"ea67d838-a947-4428-a7a0-f1a8484eadf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.590607 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea67d838-a947-4428-a7a0-f1a8484eadf1-config-volume\") pod \"collect-profiles-29500695-2dpd5\" (UID: \"ea67d838-a947-4428-a7a0-f1a8484eadf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.590769 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rq6h\" (UniqueName: \"kubernetes.io/projected/ea67d838-a947-4428-a7a0-f1a8484eadf1-kube-api-access-7rq6h\") pod \"collect-profiles-29500695-2dpd5\" (UID: \"ea67d838-a947-4428-a7a0-f1a8484eadf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.590848 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea67d838-a947-4428-a7a0-f1a8484eadf1-secret-volume\") pod \"collect-profiles-29500695-2dpd5\" (UID: \"ea67d838-a947-4428-a7a0-f1a8484eadf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.592758 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 
14:15:00.599344 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea67d838-a947-4428-a7a0-f1a8484eadf1-secret-volume\") pod \"collect-profiles-29500695-2dpd5\" (UID: \"ea67d838-a947-4428-a7a0-f1a8484eadf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.601871 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea67d838-a947-4428-a7a0-f1a8484eadf1-config-volume\") pod \"collect-profiles-29500695-2dpd5\" (UID: \"ea67d838-a947-4428-a7a0-f1a8484eadf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.610912 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rq6h\" (UniqueName: \"kubernetes.io/projected/ea67d838-a947-4428-a7a0-f1a8484eadf1-kube-api-access-7rq6h\") pod \"collect-profiles-29500695-2dpd5\" (UID: \"ea67d838-a947-4428-a7a0-f1a8484eadf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.693883 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 14:15:00 crc kubenswrapper[4721]: I0202 14:15:00.702553 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" Feb 02 14:15:01 crc kubenswrapper[4721]: I0202 14:15:01.240295 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5"] Feb 02 14:15:02 crc kubenswrapper[4721]: I0202 14:15:02.234094 4721 generic.go:334] "Generic (PLEG): container finished" podID="ea67d838-a947-4428-a7a0-f1a8484eadf1" containerID="166786c8314d83bae6bc053fd1722409d56df69d5b79d468cc598d016734d7d3" exitCode=0 Feb 02 14:15:02 crc kubenswrapper[4721]: I0202 14:15:02.234593 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" event={"ID":"ea67d838-a947-4428-a7a0-f1a8484eadf1","Type":"ContainerDied","Data":"166786c8314d83bae6bc053fd1722409d56df69d5b79d468cc598d016734d7d3"} Feb 02 14:15:02 crc kubenswrapper[4721]: I0202 14:15:02.235628 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" event={"ID":"ea67d838-a947-4428-a7a0-f1a8484eadf1","Type":"ContainerStarted","Data":"237e8b3522a50bd3200fd69a2fe9dd9469f97b50e6a3f2e0c242ea8f58c767e7"} Feb 02 14:15:03 crc kubenswrapper[4721]: I0202 14:15:03.641388 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" Feb 02 14:15:03 crc kubenswrapper[4721]: I0202 14:15:03.770239 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rq6h\" (UniqueName: \"kubernetes.io/projected/ea67d838-a947-4428-a7a0-f1a8484eadf1-kube-api-access-7rq6h\") pod \"ea67d838-a947-4428-a7a0-f1a8484eadf1\" (UID: \"ea67d838-a947-4428-a7a0-f1a8484eadf1\") " Feb 02 14:15:03 crc kubenswrapper[4721]: I0202 14:15:03.770410 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea67d838-a947-4428-a7a0-f1a8484eadf1-secret-volume\") pod \"ea67d838-a947-4428-a7a0-f1a8484eadf1\" (UID: \"ea67d838-a947-4428-a7a0-f1a8484eadf1\") " Feb 02 14:15:03 crc kubenswrapper[4721]: I0202 14:15:03.770623 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea67d838-a947-4428-a7a0-f1a8484eadf1-config-volume\") pod \"ea67d838-a947-4428-a7a0-f1a8484eadf1\" (UID: \"ea67d838-a947-4428-a7a0-f1a8484eadf1\") " Feb 02 14:15:03 crc kubenswrapper[4721]: I0202 14:15:03.771305 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea67d838-a947-4428-a7a0-f1a8484eadf1-config-volume" (OuterVolumeSpecName: "config-volume") pod "ea67d838-a947-4428-a7a0-f1a8484eadf1" (UID: "ea67d838-a947-4428-a7a0-f1a8484eadf1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 14:15:03 crc kubenswrapper[4721]: I0202 14:15:03.778165 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea67d838-a947-4428-a7a0-f1a8484eadf1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ea67d838-a947-4428-a7a0-f1a8484eadf1" (UID: "ea67d838-a947-4428-a7a0-f1a8484eadf1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 14:15:03 crc kubenswrapper[4721]: I0202 14:15:03.786552 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea67d838-a947-4428-a7a0-f1a8484eadf1-kube-api-access-7rq6h" (OuterVolumeSpecName: "kube-api-access-7rq6h") pod "ea67d838-a947-4428-a7a0-f1a8484eadf1" (UID: "ea67d838-a947-4428-a7a0-f1a8484eadf1"). InnerVolumeSpecName "kube-api-access-7rq6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 14:15:03 crc kubenswrapper[4721]: I0202 14:15:03.873186 4721 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ea67d838-a947-4428-a7a0-f1a8484eadf1-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 14:15:03 crc kubenswrapper[4721]: I0202 14:15:03.873401 4721 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea67d838-a947-4428-a7a0-f1a8484eadf1-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 14:15:03 crc kubenswrapper[4721]: I0202 14:15:03.873461 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rq6h\" (UniqueName: \"kubernetes.io/projected/ea67d838-a947-4428-a7a0-f1a8484eadf1-kube-api-access-7rq6h\") on node \"crc\" DevicePath \"\"" Feb 02 14:15:04 crc kubenswrapper[4721]: I0202 14:15:04.258626 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" event={"ID":"ea67d838-a947-4428-a7a0-f1a8484eadf1","Type":"ContainerDied","Data":"237e8b3522a50bd3200fd69a2fe9dd9469f97b50e6a3f2e0c242ea8f58c767e7"} Feb 02 14:15:04 crc kubenswrapper[4721]: I0202 14:15:04.258666 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="237e8b3522a50bd3200fd69a2fe9dd9469f97b50e6a3f2e0c242ea8f58c767e7" Feb 02 14:15:04 crc kubenswrapper[4721]: I0202 14:15:04.258689 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500695-2dpd5" Feb 02 14:15:04 crc kubenswrapper[4721]: I0202 14:15:04.724631 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"] Feb 02 14:15:04 crc kubenswrapper[4721]: I0202 14:15:04.734980 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500650-fclwl"] Feb 02 14:15:06 crc kubenswrapper[4721]: I0202 14:15:06.424355 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d19b4436-4c9b-4671-acef-1ba5685cb660" path="/var/lib/kubelet/pods/d19b4436-4c9b-4671-acef-1ba5685cb660/volumes" Feb 02 14:15:14 crc kubenswrapper[4721]: I0202 14:15:14.927340 4721 scope.go:117] "RemoveContainer" containerID="5d8115a3c44a297e5941de9c7ae62ed0d1533603d2bcff7cfc2aadd64924c9b1" Feb 02 14:16:14 crc kubenswrapper[4721]: I0202 14:16:14.763284 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:16:14 crc kubenswrapper[4721]: I0202 14:16:14.764969 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:16:44 crc kubenswrapper[4721]: I0202 14:16:44.763722 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 02 14:16:44 crc kubenswrapper[4721]: I0202 14:16:44.764278 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:17:14 crc kubenswrapper[4721]: I0202 14:17:14.763271 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:17:14 crc kubenswrapper[4721]: I0202 14:17:14.763892 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:17:14 crc kubenswrapper[4721]: I0202 14:17:14.763957 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 14:17:14 crc kubenswrapper[4721]: I0202 14:17:14.765042 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 14:17:14 crc kubenswrapper[4721]: I0202 14:17:14.765127 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" gracePeriod=600 Feb 02 14:17:14 crc kubenswrapper[4721]: E0202 14:17:14.888353 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:17:15 crc kubenswrapper[4721]: I0202 14:17:15.620887 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" exitCode=0 Feb 02 14:17:15 crc kubenswrapper[4721]: I0202 14:17:15.620952 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd"} Feb 02 14:17:15 crc kubenswrapper[4721]: I0202 14:17:15.621050 4721 scope.go:117] "RemoveContainer" containerID="814c7c146f4891242cef124426d3e01f7b0766cdf2128bc2a21fd29b8f9fdeb0" Feb 02 14:17:15 crc kubenswrapper[4721]: I0202 14:17:15.621771 4721 scope.go:117] "RemoveContainer" 
containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:17:15 crc kubenswrapper[4721]: E0202 14:17:15.622503 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:17:26 crc kubenswrapper[4721]: I0202 14:17:26.410510 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:17:26 crc kubenswrapper[4721]: E0202 14:17:26.411476 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:17:41 crc kubenswrapper[4721]: I0202 14:17:41.409951 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:17:41 crc kubenswrapper[4721]: E0202 14:17:41.410738 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:17:56 crc kubenswrapper[4721]: I0202 14:17:56.410492 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:17:56 crc kubenswrapper[4721]: E0202 14:17:56.411445 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:18:10 crc kubenswrapper[4721]: I0202 14:18:10.417635 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:18:10 crc kubenswrapper[4721]: E0202 14:18:10.418470 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:18:17 crc kubenswrapper[4721]: I0202 14:18:17.389343 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xk8wk"] Feb 02 14:18:17 crc kubenswrapper[4721]: E0202 14:18:17.390610 4721 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ea67d838-a947-4428-a7a0-f1a8484eadf1" containerName="collect-profiles" Feb 02 14:18:17 crc kubenswrapper[4721]: I0202 14:18:17.390628 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea67d838-a947-4428-a7a0-f1a8484eadf1" containerName="collect-profiles" Feb 02 14:18:17 crc kubenswrapper[4721]: I0202 14:18:17.390920 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea67d838-a947-4428-a7a0-f1a8484eadf1" containerName="collect-profiles" Feb 02 14:18:17 crc kubenswrapper[4721]: I0202 14:18:17.393426 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:17 crc kubenswrapper[4721]: I0202 14:18:17.404762 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xk8wk"] Feb 02 14:18:17 crc kubenswrapper[4721]: I0202 14:18:17.479541 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-utilities\") pod \"certified-operators-xk8wk\" (UID: \"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760\") " pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:17 crc kubenswrapper[4721]: I0202 14:18:17.479921 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-catalog-content\") pod \"certified-operators-xk8wk\" (UID: \"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760\") " pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:17 crc kubenswrapper[4721]: I0202 14:18:17.480215 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glqf6\" (UniqueName: \"kubernetes.io/projected/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-kube-api-access-glqf6\") pod \"certified-operators-xk8wk\" (UID: \"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760\") " pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:17 crc kubenswrapper[4721]: I0202 14:18:17.583425 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glqf6\" (UniqueName: \"kubernetes.io/projected/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-kube-api-access-glqf6\") pod \"certified-operators-xk8wk\" (UID: \"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760\") " pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:17 crc kubenswrapper[4721]: I0202 14:18:17.583698 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-utilities\") pod \"certified-operators-xk8wk\" (UID: \"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760\") " pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:17 crc kubenswrapper[4721]: I0202 14:18:17.583740 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-catalog-content\") pod \"certified-operators-xk8wk\" (UID: \"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760\") " pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:17 crc kubenswrapper[4721]: I0202 14:18:17.584352 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-utilities\") pod 
\"certified-operators-xk8wk\" (UID: \"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760\") " pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:17 crc kubenswrapper[4721]: I0202 14:18:17.584707 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-catalog-content\") pod \"certified-operators-xk8wk\" (UID: \"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760\") " pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:17 crc kubenswrapper[4721]: I0202 14:18:17.869114 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glqf6\" (UniqueName: \"kubernetes.io/projected/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-kube-api-access-glqf6\") pod \"certified-operators-xk8wk\" (UID: \"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760\") " pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:18 crc kubenswrapper[4721]: I0202 14:18:18.021790 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:18 crc kubenswrapper[4721]: I0202 14:18:18.586543 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xk8wk"] Feb 02 14:18:18 crc kubenswrapper[4721]: W0202 14:18:18.596560 4721 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3b23ea8_3d9e_4f03_8331_6ec86d0f4760.slice/crio-1f983f1126763143728d005de533645cffe1911ac60549421dcb0c2cdc87a5dc WatchSource:0}: Error finding container 1f983f1126763143728d005de533645cffe1911ac60549421dcb0c2cdc87a5dc: Status 404 returned error can't find the container with id 1f983f1126763143728d005de533645cffe1911ac60549421dcb0c2cdc87a5dc Feb 02 14:18:19 crc kubenswrapper[4721]: I0202 14:18:19.330966 4721 generic.go:334] "Generic (PLEG): container finished" podID="b3b23ea8-3d9e-4f03-8331-6ec86d0f4760" containerID="8e8e88c6705f054ed85bbd287815dbf15b6f91151211a88315ab51ece516163b" exitCode=0 Feb 02 14:18:19 crc kubenswrapper[4721]: I0202 14:18:19.331031 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xk8wk" event={"ID":"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760","Type":"ContainerDied","Data":"8e8e88c6705f054ed85bbd287815dbf15b6f91151211a88315ab51ece516163b"} Feb 02 14:18:19 crc kubenswrapper[4721]: I0202 14:18:19.331389 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xk8wk" event={"ID":"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760","Type":"ContainerStarted","Data":"1f983f1126763143728d005de533645cffe1911ac60549421dcb0c2cdc87a5dc"} Feb 02 14:18:19 crc kubenswrapper[4721]: I0202 14:18:19.334111 4721 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 14:18:21 crc kubenswrapper[4721]: I0202 14:18:21.351781 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xk8wk" event={"ID":"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760","Type":"ContainerStarted","Data":"e9851369fe87abdfd3439cdba5e64ae4259c0ee55e6e212a5674a12f4f7f04eb"} Feb 02 14:18:21 crc kubenswrapper[4721]: I0202 14:18:21.409790 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:18:21 crc kubenswrapper[4721]: E0202 14:18:21.410142 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:18:22 crc kubenswrapper[4721]: I0202 14:18:22.384611 4721 generic.go:334] "Generic (PLEG): container finished" podID="b3b23ea8-3d9e-4f03-8331-6ec86d0f4760" containerID="e9851369fe87abdfd3439cdba5e64ae4259c0ee55e6e212a5674a12f4f7f04eb" exitCode=0 Feb 02 14:18:22 crc kubenswrapper[4721]: I0202 14:18:22.384961 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xk8wk" event={"ID":"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760","Type":"ContainerDied","Data":"e9851369fe87abdfd3439cdba5e64ae4259c0ee55e6e212a5674a12f4f7f04eb"} Feb 02 14:18:23 crc kubenswrapper[4721]: I0202 14:18:23.397759 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xk8wk" event={"ID":"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760","Type":"ContainerStarted","Data":"a51068e3fef3b36eef11f28b1683a109165a966c637a81017f0707d2a0c459a3"} Feb 02 14:18:23 crc kubenswrapper[4721]: I0202 14:18:23.428394 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xk8wk" podStartSLOduration=2.8785531559999997 podStartE2EDuration="6.428373032s" podCreationTimestamp="2026-02-02 14:18:17 +0000 UTC" firstStartedPulling="2026-02-02 14:18:19.333891049 +0000 UTC m=+4639.636405438" lastFinishedPulling="2026-02-02 14:18:22.883710925 +0000 UTC m=+4643.186225314" observedRunningTime="2026-02-02 14:18:23.419673327 +0000 UTC m=+4643.722187716" watchObservedRunningTime="2026-02-02 14:18:23.428373032 +0000 UTC m=+4643.730887421" Feb 02 14:18:28 crc kubenswrapper[4721]: I0202 14:18:28.022434 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:28 crc kubenswrapper[4721]: I0202 14:18:28.024177 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:28 crc kubenswrapper[4721]: I0202 14:18:28.101744 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:28 crc kubenswrapper[4721]: I0202 14:18:28.529431 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:28 crc kubenswrapper[4721]: I0202 14:18:28.582923 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xk8wk"] Feb 02 14:18:30 crc kubenswrapper[4721]: I0202 14:18:30.491706 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xk8wk" podUID="b3b23ea8-3d9e-4f03-8331-6ec86d0f4760" containerName="registry-server" containerID="cri-o://a51068e3fef3b36eef11f28b1683a109165a966c637a81017f0707d2a0c459a3" gracePeriod=2 Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.137758 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.172545 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-catalog-content\") pod \"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760\" (UID: \"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760\") " Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.176795 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glqf6\" (UniqueName: \"kubernetes.io/projected/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-kube-api-access-glqf6\") pod \"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760\" (UID: \"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760\") " Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.177006 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-utilities\") pod \"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760\" (UID: \"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760\") " Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.180053 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-utilities" (OuterVolumeSpecName: "utilities") pod "b3b23ea8-3d9e-4f03-8331-6ec86d0f4760" (UID: "b3b23ea8-3d9e-4f03-8331-6ec86d0f4760"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.197063 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-kube-api-access-glqf6" (OuterVolumeSpecName: "kube-api-access-glqf6") pod "b3b23ea8-3d9e-4f03-8331-6ec86d0f4760" (UID: "b3b23ea8-3d9e-4f03-8331-6ec86d0f4760"). InnerVolumeSpecName "kube-api-access-glqf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.284008 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.284513 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glqf6\" (UniqueName: \"kubernetes.io/projected/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-kube-api-access-glqf6\") on node \"crc\" DevicePath \"\"" Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.460359 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3b23ea8-3d9e-4f03-8331-6ec86d0f4760" (UID: "b3b23ea8-3d9e-4f03-8331-6ec86d0f4760"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.488499 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.505145 4721 generic.go:334] "Generic (PLEG): container finished" podID="b3b23ea8-3d9e-4f03-8331-6ec86d0f4760" containerID="a51068e3fef3b36eef11f28b1683a109165a966c637a81017f0707d2a0c459a3" exitCode=0 Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.505190 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xk8wk" event={"ID":"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760","Type":"ContainerDied","Data":"a51068e3fef3b36eef11f28b1683a109165a966c637a81017f0707d2a0c459a3"} Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.505214 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xk8wk" event={"ID":"b3b23ea8-3d9e-4f03-8331-6ec86d0f4760","Type":"ContainerDied","Data":"1f983f1126763143728d005de533645cffe1911ac60549421dcb0c2cdc87a5dc"} Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.505231 4721 scope.go:117] "RemoveContainer" containerID="a51068e3fef3b36eef11f28b1683a109165a966c637a81017f0707d2a0c459a3" Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.505245 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xk8wk" Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.535650 4721 scope.go:117] "RemoveContainer" containerID="e9851369fe87abdfd3439cdba5e64ae4259c0ee55e6e212a5674a12f4f7f04eb" Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.555631 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xk8wk"] Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.570734 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xk8wk"] Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.572332 4721 scope.go:117] "RemoveContainer" containerID="8e8e88c6705f054ed85bbd287815dbf15b6f91151211a88315ab51ece516163b" Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.607031 4721 scope.go:117] "RemoveContainer" containerID="a51068e3fef3b36eef11f28b1683a109165a966c637a81017f0707d2a0c459a3" Feb 02 14:18:31 crc kubenswrapper[4721]: E0202 14:18:31.607402 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a51068e3fef3b36eef11f28b1683a109165a966c637a81017f0707d2a0c459a3\": container with ID starting with a51068e3fef3b36eef11f28b1683a109165a966c637a81017f0707d2a0c459a3 not found: ID does not exist" containerID="a51068e3fef3b36eef11f28b1683a109165a966c637a81017f0707d2a0c459a3" Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.607440 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a51068e3fef3b36eef11f28b1683a109165a966c637a81017f0707d2a0c459a3"} err="failed to get container status \"a51068e3fef3b36eef11f28b1683a109165a966c637a81017f0707d2a0c459a3\": rpc error: code = NotFound desc = could not find container \"a51068e3fef3b36eef11f28b1683a109165a966c637a81017f0707d2a0c459a3\": container with ID starting with a51068e3fef3b36eef11f28b1683a109165a966c637a81017f0707d2a0c459a3 not found: ID does not exist" Feb 02 
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.607463 4721 scope.go:117] "RemoveContainer" containerID="e9851369fe87abdfd3439cdba5e64ae4259c0ee55e6e212a5674a12f4f7f04eb"
Feb 02 14:18:31 crc kubenswrapper[4721]: E0202 14:18:31.607945 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9851369fe87abdfd3439cdba5e64ae4259c0ee55e6e212a5674a12f4f7f04eb\": container with ID starting with e9851369fe87abdfd3439cdba5e64ae4259c0ee55e6e212a5674a12f4f7f04eb not found: ID does not exist" containerID="e9851369fe87abdfd3439cdba5e64ae4259c0ee55e6e212a5674a12f4f7f04eb"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.607968 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9851369fe87abdfd3439cdba5e64ae4259c0ee55e6e212a5674a12f4f7f04eb"} err="failed to get container status \"e9851369fe87abdfd3439cdba5e64ae4259c0ee55e6e212a5674a12f4f7f04eb\": rpc error: code = NotFound desc = could not find container \"e9851369fe87abdfd3439cdba5e64ae4259c0ee55e6e212a5674a12f4f7f04eb\": container with ID starting with e9851369fe87abdfd3439cdba5e64ae4259c0ee55e6e212a5674a12f4f7f04eb not found: ID does not exist"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.607981 4721 scope.go:117] "RemoveContainer" containerID="8e8e88c6705f054ed85bbd287815dbf15b6f91151211a88315ab51ece516163b"
Feb 02 14:18:31 crc kubenswrapper[4721]: E0202 14:18:31.608343 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e8e88c6705f054ed85bbd287815dbf15b6f91151211a88315ab51ece516163b\": container with ID starting with 8e8e88c6705f054ed85bbd287815dbf15b6f91151211a88315ab51ece516163b not found: ID does not exist" containerID="8e8e88c6705f054ed85bbd287815dbf15b6f91151211a88315ab51ece516163b"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.608366 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e8e88c6705f054ed85bbd287815dbf15b6f91151211a88315ab51ece516163b"} err="failed to get container status \"8e8e88c6705f054ed85bbd287815dbf15b6f91151211a88315ab51ece516163b\": rpc error: code = NotFound desc = could not find container \"8e8e88c6705f054ed85bbd287815dbf15b6f91151211a88315ab51ece516163b\": container with ID starting with 8e8e88c6705f054ed85bbd287815dbf15b6f91151211a88315ab51ece516163b not found: ID does not exist"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.768299 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wmglb"]
Feb 02 14:18:31 crc kubenswrapper[4721]: E0202 14:18:31.768992 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b23ea8-3d9e-4f03-8331-6ec86d0f4760" containerName="extract-content"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.769011 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b23ea8-3d9e-4f03-8331-6ec86d0f4760" containerName="extract-content"
Feb 02 14:18:31 crc kubenswrapper[4721]: E0202 14:18:31.769045 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b23ea8-3d9e-4f03-8331-6ec86d0f4760" containerName="registry-server"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.769053 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b23ea8-3d9e-4f03-8331-6ec86d0f4760" containerName="registry-server"
Feb 02 14:18:31 crc kubenswrapper[4721]: E0202 14:18:31.769086 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3b23ea8-3d9e-4f03-8331-6ec86d0f4760" containerName="extract-utilities"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.769096 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3b23ea8-3d9e-4f03-8331-6ec86d0f4760" containerName="extract-utilities"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.769413 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3b23ea8-3d9e-4f03-8331-6ec86d0f4760" containerName="registry-server"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.771701 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wmglb"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.796110 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2qwd\" (UniqueName: \"kubernetes.io/projected/263f27d2-5f52-41cf-9ff9-62bd2b195df4-kube-api-access-z2qwd\") pod \"redhat-operators-wmglb\" (UID: \"263f27d2-5f52-41cf-9ff9-62bd2b195df4\") " pod="openshift-marketplace/redhat-operators-wmglb"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.796219 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/263f27d2-5f52-41cf-9ff9-62bd2b195df4-utilities\") pod \"redhat-operators-wmglb\" (UID: \"263f27d2-5f52-41cf-9ff9-62bd2b195df4\") " pod="openshift-marketplace/redhat-operators-wmglb"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.796386 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/263f27d2-5f52-41cf-9ff9-62bd2b195df4-catalog-content\") pod \"redhat-operators-wmglb\" (UID: \"263f27d2-5f52-41cf-9ff9-62bd2b195df4\") " pod="openshift-marketplace/redhat-operators-wmglb"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.802122 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wmglb"]
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.899296 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/263f27d2-5f52-41cf-9ff9-62bd2b195df4-catalog-content\") pod \"redhat-operators-wmglb\" (UID: \"263f27d2-5f52-41cf-9ff9-62bd2b195df4\") " pod="openshift-marketplace/redhat-operators-wmglb"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.899777 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/263f27d2-5f52-41cf-9ff9-62bd2b195df4-catalog-content\") pod \"redhat-operators-wmglb\" (UID: \"263f27d2-5f52-41cf-9ff9-62bd2b195df4\") " pod="openshift-marketplace/redhat-operators-wmglb"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.899788 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2qwd\" (UniqueName: \"kubernetes.io/projected/263f27d2-5f52-41cf-9ff9-62bd2b195df4-kube-api-access-z2qwd\") pod \"redhat-operators-wmglb\" (UID: \"263f27d2-5f52-41cf-9ff9-62bd2b195df4\") " pod="openshift-marketplace/redhat-operators-wmglb"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.899863 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/263f27d2-5f52-41cf-9ff9-62bd2b195df4-utilities\") pod \"redhat-operators-wmglb\" (UID: \"263f27d2-5f52-41cf-9ff9-62bd2b195df4\") " pod="openshift-marketplace/redhat-operators-wmglb"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.900169 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/263f27d2-5f52-41cf-9ff9-62bd2b195df4-utilities\") pod \"redhat-operators-wmglb\" (UID: \"263f27d2-5f52-41cf-9ff9-62bd2b195df4\") " pod="openshift-marketplace/redhat-operators-wmglb"
Feb 02 14:18:31 crc kubenswrapper[4721]: I0202 14:18:31.932179 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2qwd\" (UniqueName: \"kubernetes.io/projected/263f27d2-5f52-41cf-9ff9-62bd2b195df4-kube-api-access-z2qwd\") pod \"redhat-operators-wmglb\" (UID: \"263f27d2-5f52-41cf-9ff9-62bd2b195df4\") " pod="openshift-marketplace/redhat-operators-wmglb"
Feb 02 14:18:32 crc kubenswrapper[4721]: I0202 14:18:32.105030 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wmglb"
Feb 02 14:18:32 crc kubenswrapper[4721]: I0202 14:18:32.426880 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3b23ea8-3d9e-4f03-8331-6ec86d0f4760" path="/var/lib/kubelet/pods/b3b23ea8-3d9e-4f03-8331-6ec86d0f4760/volumes"
Feb 02 14:18:32 crc kubenswrapper[4721]: I0202 14:18:32.620834 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wmglb"]
Feb 02 14:18:33 crc kubenswrapper[4721]: I0202 14:18:33.410577 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd"
Feb 02 14:18:33 crc kubenswrapper[4721]: E0202 14:18:33.411457 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 14:18:33 crc kubenswrapper[4721]: I0202 14:18:33.538920 4721 generic.go:334] "Generic (PLEG): container finished" podID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" containerID="34d8e37ac136c048e14db0e26d0ed848b873e987fc398b35a174771fee1eaac6" exitCode=0
Feb 02 14:18:33 crc kubenswrapper[4721]: I0202 14:18:33.538986 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wmglb" event={"ID":"263f27d2-5f52-41cf-9ff9-62bd2b195df4","Type":"ContainerDied","Data":"34d8e37ac136c048e14db0e26d0ed848b873e987fc398b35a174771fee1eaac6"}
Feb 02 14:18:33 crc kubenswrapper[4721]: I0202 14:18:33.539020 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wmglb" event={"ID":"263f27d2-5f52-41cf-9ff9-62bd2b195df4","Type":"ContainerStarted","Data":"c596e828defbc279db6bc8099dcdc050b683b9191c7efd81b06007f36b53c227"}
Feb 02 14:18:35 crc kubenswrapper[4721]: I0202 14:18:35.560582 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wmglb" event={"ID":"263f27d2-5f52-41cf-9ff9-62bd2b195df4","Type":"ContainerStarted","Data":"32274dd2adabed2a33935a26e58c08e0355e6b861fe39345b34b7d33608acbb7"}
Feb 02 14:18:38 crc kubenswrapper[4721]: I0202 14:18:38.040028 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m8lnf"]
Feb 02 14:18:38 crc kubenswrapper[4721]: I0202 14:18:38.043301 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m8lnf"
Feb 02 14:18:38 crc kubenswrapper[4721]: I0202 14:18:38.050815 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m8lnf"]
Feb 02 14:18:38 crc kubenswrapper[4721]: I0202 14:18:38.056216 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8t7m\" (UniqueName: \"kubernetes.io/projected/8bdddc4b-0476-4729-9d40-838e53a75e9f-kube-api-access-v8t7m\") pod \"community-operators-m8lnf\" (UID: \"8bdddc4b-0476-4729-9d40-838e53a75e9f\") " pod="openshift-marketplace/community-operators-m8lnf"
Feb 02 14:18:38 crc kubenswrapper[4721]: I0202 14:18:38.056287 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bdddc4b-0476-4729-9d40-838e53a75e9f-catalog-content\") pod \"community-operators-m8lnf\" (UID: \"8bdddc4b-0476-4729-9d40-838e53a75e9f\") " pod="openshift-marketplace/community-operators-m8lnf"
Feb 02 14:18:38 crc kubenswrapper[4721]: I0202 14:18:38.056328 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bdddc4b-0476-4729-9d40-838e53a75e9f-utilities\") pod \"community-operators-m8lnf\" (UID: \"8bdddc4b-0476-4729-9d40-838e53a75e9f\") " pod="openshift-marketplace/community-operators-m8lnf"
Feb 02 14:18:38 crc kubenswrapper[4721]: I0202 14:18:38.170116 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8t7m\" (UniqueName: \"kubernetes.io/projected/8bdddc4b-0476-4729-9d40-838e53a75e9f-kube-api-access-v8t7m\") pod \"community-operators-m8lnf\" (UID: \"8bdddc4b-0476-4729-9d40-838e53a75e9f\") " pod="openshift-marketplace/community-operators-m8lnf"
Feb 02 14:18:38 crc kubenswrapper[4721]: I0202 14:18:38.171271 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bdddc4b-0476-4729-9d40-838e53a75e9f-catalog-content\") pod \"community-operators-m8lnf\" (UID: \"8bdddc4b-0476-4729-9d40-838e53a75e9f\") " pod="openshift-marketplace/community-operators-m8lnf"
Feb 02 14:18:38 crc kubenswrapper[4721]: I0202 14:18:38.170289 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bdddc4b-0476-4729-9d40-838e53a75e9f-catalog-content\") pod \"community-operators-m8lnf\" (UID: \"8bdddc4b-0476-4729-9d40-838e53a75e9f\") " pod="openshift-marketplace/community-operators-m8lnf"
Feb 02 14:18:38 crc kubenswrapper[4721]: I0202 14:18:38.175248 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bdddc4b-0476-4729-9d40-838e53a75e9f-utilities\") pod \"community-operators-m8lnf\" (UID: \"8bdddc4b-0476-4729-9d40-838e53a75e9f\") " pod="openshift-marketplace/community-operators-m8lnf"
Feb 02 14:18:38 crc kubenswrapper[4721]: I0202 14:18:38.175843 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bdddc4b-0476-4729-9d40-838e53a75e9f-utilities\") pod \"community-operators-m8lnf\" (UID: \"8bdddc4b-0476-4729-9d40-838e53a75e9f\") " pod="openshift-marketplace/community-operators-m8lnf"
Feb 02 14:18:38 crc kubenswrapper[4721]: I0202 14:18:38.191019 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8t7m\" (UniqueName: \"kubernetes.io/projected/8bdddc4b-0476-4729-9d40-838e53a75e9f-kube-api-access-v8t7m\") pod \"community-operators-m8lnf\" (UID: \"8bdddc4b-0476-4729-9d40-838e53a75e9f\") " pod="openshift-marketplace/community-operators-m8lnf"
Feb 02 14:18:38 crc kubenswrapper[4721]: I0202 14:18:38.370938 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m8lnf"
Feb 02 14:18:39 crc kubenswrapper[4721]: I0202 14:18:39.011983 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m8lnf"]
Feb 02 14:18:39 crc kubenswrapper[4721]: I0202 14:18:39.600620 4721 generic.go:334] "Generic (PLEG): container finished" podID="8bdddc4b-0476-4729-9d40-838e53a75e9f" containerID="71d9066b6279724babc25c954a9a5748114dfc922de8badfc942f10b62352c3b" exitCode=0
Feb 02 14:18:39 crc kubenswrapper[4721]: I0202 14:18:39.600763 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8lnf" event={"ID":"8bdddc4b-0476-4729-9d40-838e53a75e9f","Type":"ContainerDied","Data":"71d9066b6279724babc25c954a9a5748114dfc922de8badfc942f10b62352c3b"}
Feb 02 14:18:39 crc kubenswrapper[4721]: I0202 14:18:39.600913 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8lnf" event={"ID":"8bdddc4b-0476-4729-9d40-838e53a75e9f","Type":"ContainerStarted","Data":"3f21840e861b7563f2da3688dda64fc46f9019980ecde86841aff67af7b87fa8"}
Feb 02 14:18:40 crc kubenswrapper[4721]: I0202 14:18:40.613723 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8lnf" event={"ID":"8bdddc4b-0476-4729-9d40-838e53a75e9f","Type":"ContainerStarted","Data":"a4cabbd58b03ffc78d865bfc47dd67e205be5cb898bd874018920d079acaae95"}
Feb 02 14:18:40 crc kubenswrapper[4721]: I0202 14:18:40.617274 4721 generic.go:334] "Generic (PLEG): container finished" podID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" containerID="32274dd2adabed2a33935a26e58c08e0355e6b861fe39345b34b7d33608acbb7" exitCode=0
Feb 02 14:18:40 crc kubenswrapper[4721]: I0202 14:18:40.617317 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wmglb" event={"ID":"263f27d2-5f52-41cf-9ff9-62bd2b195df4","Type":"ContainerDied","Data":"32274dd2adabed2a33935a26e58c08e0355e6b861fe39345b34b7d33608acbb7"}
Feb 02 14:18:41 crc kubenswrapper[4721]: I0202 14:18:41.628892 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wmglb" event={"ID":"263f27d2-5f52-41cf-9ff9-62bd2b195df4","Type":"ContainerStarted","Data":"5bd7359f565ae41c756b8b3ce5bf851cdd66a9eb0ea910770311cd6e1d283e4b"}
Feb 02 14:18:41 crc kubenswrapper[4721]: I0202 14:18:41.657444 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wmglb" podStartSLOduration=2.924546854 podStartE2EDuration="10.657424225s" podCreationTimestamp="2026-02-02 14:18:31 +0000 UTC" firstStartedPulling="2026-02-02 14:18:33.541965903 +0000 UTC m=+4653.844480302" lastFinishedPulling="2026-02-02 14:18:41.274843284 +0000 UTC m=+4661.577357673" observedRunningTime="2026-02-02 14:18:41.64838612 +0000 UTC m=+4661.950900519" watchObservedRunningTime="2026-02-02 14:18:41.657424225 +0000 UTC m=+4661.959938614"
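
The pod_startup_latency_tracker line above reports two figures for redhat-operators-wmglb. Reading the monotonic m=+ offsets it logs, podStartSLOduration appears to be the end-to-end startup duration minus the image-pull window, and the arithmetic below reproduces the logged value to the digit (this is my inference from the logged fields, not a statement about the tracker's source):

package main

import "fmt"

func main() {
	// m=+ offsets and durations copied from the log line above.
	const (
		firstStartedPulling = 4653.844480302 // seconds since kubelet start
		lastFinishedPulling = 4661.577357673
		podStartE2E         = 10.657424225
		loggedSLO           = 2.924546854
	)
	pullWindow := lastFinishedPulling - firstStartedPulling
	fmt.Printf("image pull window = %.9fs\n", pullWindow)      // 7.732877371s
	fmt.Printf("E2E - pull window = %.9fs\n", podStartE2E-pullWindow)
	fmt.Printf("logged SLO value  = %.9fs\n", loggedSLO)       // matches
}

The same identity holds for the community-operators-m8lnf line below (6.682182444 - 3.453833455 = 3.228348989), so the SLO figure is evidently the startup time with image pulls excluded.
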
Feb 02 14:18:42 crc kubenswrapper[4721]: I0202 14:18:42.105703 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wmglb"
Feb 02 14:18:42 crc kubenswrapper[4721]: I0202 14:18:42.105768 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wmglb"
Feb 02 14:18:42 crc kubenswrapper[4721]: I0202 14:18:42.644513 4721 generic.go:334] "Generic (PLEG): container finished" podID="8bdddc4b-0476-4729-9d40-838e53a75e9f" containerID="a4cabbd58b03ffc78d865bfc47dd67e205be5cb898bd874018920d079acaae95" exitCode=0
Feb 02 14:18:42 crc kubenswrapper[4721]: I0202 14:18:42.644591 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8lnf" event={"ID":"8bdddc4b-0476-4729-9d40-838e53a75e9f","Type":"ContainerDied","Data":"a4cabbd58b03ffc78d865bfc47dd67e205be5cb898bd874018920d079acaae95"}
Feb 02 14:18:43 crc kubenswrapper[4721]: I0202 14:18:43.241450 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wmglb" podUID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" containerName="registry-server" probeResult="failure" output=<
Feb 02 14:18:43 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s
Feb 02 14:18:43 crc kubenswrapper[4721]: >
Feb 02 14:18:43 crc kubenswrapper[4721]: I0202 14:18:43.655461 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8lnf" event={"ID":"8bdddc4b-0476-4729-9d40-838e53a75e9f","Type":"ContainerStarted","Data":"18f97b0b44f86ef13cc59abe3aab035bf2f3370b9f54c3d9fa9d566914b83074"}
Feb 02 14:18:43 crc kubenswrapper[4721]: I0202 14:18:43.682197 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m8lnf" podStartSLOduration=3.228348989 podStartE2EDuration="6.682182444s" podCreationTimestamp="2026-02-02 14:18:37 +0000 UTC" firstStartedPulling="2026-02-02 14:18:39.603501838 +0000 UTC m=+4659.906016227" lastFinishedPulling="2026-02-02 14:18:43.057335293 +0000 UTC m=+4663.359849682" observedRunningTime="2026-02-02 14:18:43.679895602 +0000 UTC m=+4663.982409991" watchObservedRunningTime="2026-02-02 14:18:43.682182444 +0000 UTC m=+4663.984696833"
Feb 02 14:18:44 crc kubenswrapper[4721]: I0202 14:18:44.409903 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd"
Feb 02 14:18:44 crc kubenswrapper[4721]: E0202 14:18:44.410265 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 14:18:48 crc kubenswrapper[4721]: I0202 14:18:48.371320 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m8lnf"
Feb 02 14:18:48 crc kubenswrapper[4721]: I0202 14:18:48.372849 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m8lnf"
Feb 02 14:18:48 crc kubenswrapper[4721]: I0202 14:18:48.920770 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m8lnf"
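
Interleaved with the marketplace churn, machine-config-daemon-rppjz is in CrashLoopBackOff, and every retry above is refused with "back-off 5m0s restarting failed container". The 5m0s is the back-off ceiling: the kubelet doubles the wait after each crash until it reaches a cap. A sketch of that policy, assuming the commonly cited kubelet defaults of a 10s initial step and a 5m cap (the function is my illustration, not kubelet code):

package main

import (
	"fmt"
	"time"
)

// backoff returns the wait before restart attempt n under a
// double-with-cap policy: 10s, 20s, 40s, ... capped at 5m.
func backoff(restarts int) time.Duration {
	d := 10 * time.Second
	for i := 0; i < restarts; i++ {
		d *= 2
		if d >= 5*time.Minute {
			return 5 * time.Minute
		}
	}
	return d
}

func main() {
	for r := 0; r <= 6; r++ {
		fmt.Printf("restart %d -> wait %v\n", r, backoff(r))
	}
	// by the fifth restart the wait has hit the 5m0s cap that the
	// machine-config-daemon messages in this log keep reporting
}
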
Feb 02 14:18:48 crc kubenswrapper[4721]: I0202 14:18:48.971659 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m8lnf"
Feb 02 14:18:49 crc kubenswrapper[4721]: I0202 14:18:49.160955 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m8lnf"]
Feb 02 14:18:50 crc kubenswrapper[4721]: I0202 14:18:50.715301 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m8lnf" podUID="8bdddc4b-0476-4729-9d40-838e53a75e9f" containerName="registry-server" containerID="cri-o://18f97b0b44f86ef13cc59abe3aab035bf2f3370b9f54c3d9fa9d566914b83074" gracePeriod=2
Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.242292 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m8lnf"
Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.344358 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8t7m\" (UniqueName: \"kubernetes.io/projected/8bdddc4b-0476-4729-9d40-838e53a75e9f-kube-api-access-v8t7m\") pod \"8bdddc4b-0476-4729-9d40-838e53a75e9f\" (UID: \"8bdddc4b-0476-4729-9d40-838e53a75e9f\") "
Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.344530 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bdddc4b-0476-4729-9d40-838e53a75e9f-utilities\") pod \"8bdddc4b-0476-4729-9d40-838e53a75e9f\" (UID: \"8bdddc4b-0476-4729-9d40-838e53a75e9f\") "
Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.344679 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bdddc4b-0476-4729-9d40-838e53a75e9f-catalog-content\") pod \"8bdddc4b-0476-4729-9d40-838e53a75e9f\" (UID: \"8bdddc4b-0476-4729-9d40-838e53a75e9f\") "
Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.345681 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bdddc4b-0476-4729-9d40-838e53a75e9f-utilities" (OuterVolumeSpecName: "utilities") pod "8bdddc4b-0476-4729-9d40-838e53a75e9f" (UID: "8bdddc4b-0476-4729-9d40-838e53a75e9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.350006 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bdddc4b-0476-4729-9d40-838e53a75e9f-kube-api-access-v8t7m" (OuterVolumeSpecName: "kube-api-access-v8t7m") pod "8bdddc4b-0476-4729-9d40-838e53a75e9f" (UID: "8bdddc4b-0476-4729-9d40-838e53a75e9f"). InnerVolumeSpecName "kube-api-access-v8t7m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.398353 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bdddc4b-0476-4729-9d40-838e53a75e9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8bdddc4b-0476-4729-9d40-838e53a75e9f" (UID: "8bdddc4b-0476-4729-9d40-838e53a75e9f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.452150 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bdddc4b-0476-4729-9d40-838e53a75e9f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.452177 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8t7m\" (UniqueName: \"kubernetes.io/projected/8bdddc4b-0476-4729-9d40-838e53a75e9f-kube-api-access-v8t7m\") on node \"crc\" DevicePath \"\""
Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.452190 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bdddc4b-0476-4729-9d40-838e53a75e9f-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.730655 4721 generic.go:334] "Generic (PLEG): container finished" podID="8bdddc4b-0476-4729-9d40-838e53a75e9f" containerID="18f97b0b44f86ef13cc59abe3aab035bf2f3370b9f54c3d9fa9d566914b83074" exitCode=0
Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.731465 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8lnf" event={"ID":"8bdddc4b-0476-4729-9d40-838e53a75e9f","Type":"ContainerDied","Data":"18f97b0b44f86ef13cc59abe3aab035bf2f3370b9f54c3d9fa9d566914b83074"}
Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.731511 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8lnf" event={"ID":"8bdddc4b-0476-4729-9d40-838e53a75e9f","Type":"ContainerDied","Data":"3f21840e861b7563f2da3688dda64fc46f9019980ecde86841aff67af7b87fa8"}
Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.731531 4721 scope.go:117] "RemoveContainer" containerID="18f97b0b44f86ef13cc59abe3aab035bf2f3370b9f54c3d9fa9d566914b83074"
Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.731688 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m8lnf"
Need to start a new one" pod="openshift-marketplace/community-operators-m8lnf" Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.762235 4721 scope.go:117] "RemoveContainer" containerID="a4cabbd58b03ffc78d865bfc47dd67e205be5cb898bd874018920d079acaae95" Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.772875 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m8lnf"] Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.784097 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m8lnf"] Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.791468 4721 scope.go:117] "RemoveContainer" containerID="71d9066b6279724babc25c954a9a5748114dfc922de8badfc942f10b62352c3b" Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.846563 4721 scope.go:117] "RemoveContainer" containerID="18f97b0b44f86ef13cc59abe3aab035bf2f3370b9f54c3d9fa9d566914b83074" Feb 02 14:18:51 crc kubenswrapper[4721]: E0202 14:18:51.847244 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18f97b0b44f86ef13cc59abe3aab035bf2f3370b9f54c3d9fa9d566914b83074\": container with ID starting with 18f97b0b44f86ef13cc59abe3aab035bf2f3370b9f54c3d9fa9d566914b83074 not found: ID does not exist" containerID="18f97b0b44f86ef13cc59abe3aab035bf2f3370b9f54c3d9fa9d566914b83074" Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.847330 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18f97b0b44f86ef13cc59abe3aab035bf2f3370b9f54c3d9fa9d566914b83074"} err="failed to get container status \"18f97b0b44f86ef13cc59abe3aab035bf2f3370b9f54c3d9fa9d566914b83074\": rpc error: code = NotFound desc = could not find container \"18f97b0b44f86ef13cc59abe3aab035bf2f3370b9f54c3d9fa9d566914b83074\": container with ID starting with 18f97b0b44f86ef13cc59abe3aab035bf2f3370b9f54c3d9fa9d566914b83074 not found: ID does not exist" Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.847382 4721 scope.go:117] "RemoveContainer" containerID="a4cabbd58b03ffc78d865bfc47dd67e205be5cb898bd874018920d079acaae95" Feb 02 14:18:51 crc kubenswrapper[4721]: E0202 14:18:51.847796 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4cabbd58b03ffc78d865bfc47dd67e205be5cb898bd874018920d079acaae95\": container with ID starting with a4cabbd58b03ffc78d865bfc47dd67e205be5cb898bd874018920d079acaae95 not found: ID does not exist" containerID="a4cabbd58b03ffc78d865bfc47dd67e205be5cb898bd874018920d079acaae95" Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.847829 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4cabbd58b03ffc78d865bfc47dd67e205be5cb898bd874018920d079acaae95"} err="failed to get container status \"a4cabbd58b03ffc78d865bfc47dd67e205be5cb898bd874018920d079acaae95\": rpc error: code = NotFound desc = could not find container \"a4cabbd58b03ffc78d865bfc47dd67e205be5cb898bd874018920d079acaae95\": container with ID starting with a4cabbd58b03ffc78d865bfc47dd67e205be5cb898bd874018920d079acaae95 not found: ID does not exist" Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.847857 4721 scope.go:117] "RemoveContainer" containerID="71d9066b6279724babc25c954a9a5748114dfc922de8badfc942f10b62352c3b" Feb 02 14:18:51 crc kubenswrapper[4721]: E0202 14:18:51.848540 4721 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"71d9066b6279724babc25c954a9a5748114dfc922de8badfc942f10b62352c3b\": container with ID starting with 71d9066b6279724babc25c954a9a5748114dfc922de8badfc942f10b62352c3b not found: ID does not exist" containerID="71d9066b6279724babc25c954a9a5748114dfc922de8badfc942f10b62352c3b" Feb 02 14:18:51 crc kubenswrapper[4721]: I0202 14:18:51.848569 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71d9066b6279724babc25c954a9a5748114dfc922de8badfc942f10b62352c3b"} err="failed to get container status \"71d9066b6279724babc25c954a9a5748114dfc922de8badfc942f10b62352c3b\": rpc error: code = NotFound desc = could not find container \"71d9066b6279724babc25c954a9a5748114dfc922de8badfc942f10b62352c3b\": container with ID starting with 71d9066b6279724babc25c954a9a5748114dfc922de8badfc942f10b62352c3b not found: ID does not exist" Feb 02 14:18:52 crc kubenswrapper[4721]: I0202 14:18:52.425436 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bdddc4b-0476-4729-9d40-838e53a75e9f" path="/var/lib/kubelet/pods/8bdddc4b-0476-4729-9d40-838e53a75e9f/volumes" Feb 02 14:18:53 crc kubenswrapper[4721]: I0202 14:18:53.153321 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wmglb" podUID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" containerName="registry-server" probeResult="failure" output=< Feb 02 14:18:53 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 14:18:53 crc kubenswrapper[4721]: > Feb 02 14:18:57 crc kubenswrapper[4721]: I0202 14:18:57.409678 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:18:57 crc kubenswrapper[4721]: E0202 14:18:57.410304 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:19:03 crc kubenswrapper[4721]: I0202 14:19:03.172894 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wmglb" podUID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" containerName="registry-server" probeResult="failure" output=< Feb 02 14:19:03 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 14:19:03 crc kubenswrapper[4721]: > Feb 02 14:19:08 crc kubenswrapper[4721]: I0202 14:19:08.410339 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:19:08 crc kubenswrapper[4721]: E0202 14:19:08.411476 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:19:13 crc kubenswrapper[4721]: I0202 14:19:13.151373 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wmglb" 
podUID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" containerName="registry-server" probeResult="failure" output=< Feb 02 14:19:13 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 14:19:13 crc kubenswrapper[4721]: > Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.433489 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hrx6w"] Feb 02 14:19:18 crc kubenswrapper[4721]: E0202 14:19:18.434787 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bdddc4b-0476-4729-9d40-838e53a75e9f" containerName="registry-server" Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.434806 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bdddc4b-0476-4729-9d40-838e53a75e9f" containerName="registry-server" Feb 02 14:19:18 crc kubenswrapper[4721]: E0202 14:19:18.434836 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bdddc4b-0476-4729-9d40-838e53a75e9f" containerName="extract-content" Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.434845 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bdddc4b-0476-4729-9d40-838e53a75e9f" containerName="extract-content" Feb 02 14:19:18 crc kubenswrapper[4721]: E0202 14:19:18.434892 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bdddc4b-0476-4729-9d40-838e53a75e9f" containerName="extract-utilities" Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.434902 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bdddc4b-0476-4729-9d40-838e53a75e9f" containerName="extract-utilities" Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.435228 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bdddc4b-0476-4729-9d40-838e53a75e9f" containerName="registry-server" Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.437450 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.447471 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrx6w"] Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.556474 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blxbm\" (UniqueName: \"kubernetes.io/projected/3005aa34-5adf-43d0-90b8-82f91624d082-kube-api-access-blxbm\") pod \"redhat-marketplace-hrx6w\" (UID: \"3005aa34-5adf-43d0-90b8-82f91624d082\") " pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.556688 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3005aa34-5adf-43d0-90b8-82f91624d082-catalog-content\") pod \"redhat-marketplace-hrx6w\" (UID: \"3005aa34-5adf-43d0-90b8-82f91624d082\") " pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.556834 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3005aa34-5adf-43d0-90b8-82f91624d082-utilities\") pod \"redhat-marketplace-hrx6w\" (UID: \"3005aa34-5adf-43d0-90b8-82f91624d082\") " pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.659395 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3005aa34-5adf-43d0-90b8-82f91624d082-catalog-content\") pod \"redhat-marketplace-hrx6w\" (UID: \"3005aa34-5adf-43d0-90b8-82f91624d082\") " pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.659452 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3005aa34-5adf-43d0-90b8-82f91624d082-utilities\") pod \"redhat-marketplace-hrx6w\" (UID: \"3005aa34-5adf-43d0-90b8-82f91624d082\") " pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.659585 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blxbm\" (UniqueName: \"kubernetes.io/projected/3005aa34-5adf-43d0-90b8-82f91624d082-kube-api-access-blxbm\") pod \"redhat-marketplace-hrx6w\" (UID: \"3005aa34-5adf-43d0-90b8-82f91624d082\") " pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.660027 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3005aa34-5adf-43d0-90b8-82f91624d082-catalog-content\") pod \"redhat-marketplace-hrx6w\" (UID: \"3005aa34-5adf-43d0-90b8-82f91624d082\") " pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.660240 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3005aa34-5adf-43d0-90b8-82f91624d082-utilities\") pod \"redhat-marketplace-hrx6w\" (UID: \"3005aa34-5adf-43d0-90b8-82f91624d082\") " pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.680959 4721 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-blxbm\" (UniqueName: \"kubernetes.io/projected/3005aa34-5adf-43d0-90b8-82f91624d082-kube-api-access-blxbm\") pod \"redhat-marketplace-hrx6w\" (UID: \"3005aa34-5adf-43d0-90b8-82f91624d082\") " pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:18 crc kubenswrapper[4721]: I0202 14:19:18.761967 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:19 crc kubenswrapper[4721]: I0202 14:19:19.338502 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrx6w"] Feb 02 14:19:20 crc kubenswrapper[4721]: I0202 14:19:20.112412 4721 generic.go:334] "Generic (PLEG): container finished" podID="3005aa34-5adf-43d0-90b8-82f91624d082" containerID="ad82f58e137f8b92244c3701236f7ce0785115f9c6d9bd8ea8baa43e82f9a1be" exitCode=0 Feb 02 14:19:20 crc kubenswrapper[4721]: I0202 14:19:20.112448 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrx6w" event={"ID":"3005aa34-5adf-43d0-90b8-82f91624d082","Type":"ContainerDied","Data":"ad82f58e137f8b92244c3701236f7ce0785115f9c6d9bd8ea8baa43e82f9a1be"} Feb 02 14:19:20 crc kubenswrapper[4721]: I0202 14:19:20.112473 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrx6w" event={"ID":"3005aa34-5adf-43d0-90b8-82f91624d082","Type":"ContainerStarted","Data":"c90699de105f06a3330b3309c058512dd68a76e320f7810c34d0e844de217906"} Feb 02 14:19:21 crc kubenswrapper[4721]: I0202 14:19:21.130801 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrx6w" event={"ID":"3005aa34-5adf-43d0-90b8-82f91624d082","Type":"ContainerStarted","Data":"b7c5d3e0cc3de24c3af77da64cc04f012d62a7c8cb308e3619dc72c6677d70d2"} Feb 02 14:19:22 crc kubenswrapper[4721]: I0202 14:19:22.155032 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wmglb" Feb 02 14:19:22 crc kubenswrapper[4721]: I0202 14:19:22.211309 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wmglb" Feb 02 14:19:22 crc kubenswrapper[4721]: I0202 14:19:22.410044 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:19:22 crc kubenswrapper[4721]: E0202 14:19:22.410669 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:19:23 crc kubenswrapper[4721]: I0202 14:19:23.156544 4721 generic.go:334] "Generic (PLEG): container finished" podID="3005aa34-5adf-43d0-90b8-82f91624d082" containerID="b7c5d3e0cc3de24c3af77da64cc04f012d62a7c8cb308e3619dc72c6677d70d2" exitCode=0 Feb 02 14:19:23 crc kubenswrapper[4721]: I0202 14:19:23.156626 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrx6w" event={"ID":"3005aa34-5adf-43d0-90b8-82f91624d082","Type":"ContainerDied","Data":"b7c5d3e0cc3de24c3af77da64cc04f012d62a7c8cb308e3619dc72c6677d70d2"} Feb 02 14:19:24 crc kubenswrapper[4721]: I0202 
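
Three event sources are visible in the lines around here: "SyncLoop ADD/UPDATE/DELETE" with source="api", "SyncLoop (PLEG)" events generated by relisting the container runtime, and "SyncLoop (probe)" status flips. They all drain into a single loop, which a toy fan-in can model (shape only; the names are mine and the kubelet's real loop lives in pkg/kubelet):

package main

import "fmt"

type event struct{ kind, pod string }

// syncLoop drains n events from three sources, mirroring the interleaving of
// source="api", "(PLEG)" and "(probe)" lines in this log.
func syncLoop(api, pleg, probe <-chan event, n int) {
	for i := 0; i < n; i++ {
		select {
		case e := <-api:
			fmt.Printf("SyncLoop %s source=\"api\" pod=%q\n", e.kind, e.pod)
		case e := <-pleg:
			fmt.Printf("SyncLoop (PLEG): %s pod=%q\n", e.kind, e.pod)
		case e := <-probe:
			fmt.Printf("SyncLoop (probe) status=%q pod=%q\n", e.kind, e.pod)
		}
	}
}

func main() {
	api, pleg, probe := make(chan event, 1), make(chan event, 1), make(chan event, 1)
	api <- event{"ADD", "openshift-marketplace/redhat-marketplace-hrx6w"}
	pleg <- event{"ContainerStarted", "openshift-marketplace/redhat-marketplace-hrx6w"}
	probe <- event{"ready", "openshift-marketplace/redhat-operators-wmglb"}
	syncLoop(api, pleg, probe, 3)
}
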
Feb 02 14:19:24 crc kubenswrapper[4721]: I0202 14:19:24.175807 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrx6w" event={"ID":"3005aa34-5adf-43d0-90b8-82f91624d082","Type":"ContainerStarted","Data":"16cf2adabf3bb01467ce25195f0e1b9aafc8eb968866b3449ec06bbbe287ee87"}
Feb 02 14:19:24 crc kubenswrapper[4721]: I0202 14:19:24.202095 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wmglb"]
Feb 02 14:19:24 crc kubenswrapper[4721]: I0202 14:19:24.202409 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wmglb" podUID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" containerName="registry-server" containerID="cri-o://5bd7359f565ae41c756b8b3ce5bf851cdd66a9eb0ea910770311cd6e1d283e4b" gracePeriod=2
Feb 02 14:19:24 crc kubenswrapper[4721]: I0202 14:19:24.222537 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hrx6w" podStartSLOduration=2.7550519700000002 podStartE2EDuration="6.222515063s" podCreationTimestamp="2026-02-02 14:19:18 +0000 UTC" firstStartedPulling="2026-02-02 14:19:20.115149673 +0000 UTC m=+4700.417664102" lastFinishedPulling="2026-02-02 14:19:23.582612806 +0000 UTC m=+4703.885127195" observedRunningTime="2026-02-02 14:19:24.221193158 +0000 UTC m=+4704.523707557" watchObservedRunningTime="2026-02-02 14:19:24.222515063 +0000 UTC m=+4704.525029472"
Feb 02 14:19:24 crc kubenswrapper[4721]: I0202 14:19:24.804279 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wmglb"
Feb 02 14:19:24 crc kubenswrapper[4721]: I0202 14:19:24.840949 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2qwd\" (UniqueName: \"kubernetes.io/projected/263f27d2-5f52-41cf-9ff9-62bd2b195df4-kube-api-access-z2qwd\") pod \"263f27d2-5f52-41cf-9ff9-62bd2b195df4\" (UID: \"263f27d2-5f52-41cf-9ff9-62bd2b195df4\") "
Feb 02 14:19:24 crc kubenswrapper[4721]: I0202 14:19:24.841130 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/263f27d2-5f52-41cf-9ff9-62bd2b195df4-utilities\") pod \"263f27d2-5f52-41cf-9ff9-62bd2b195df4\" (UID: \"263f27d2-5f52-41cf-9ff9-62bd2b195df4\") "
Feb 02 14:19:24 crc kubenswrapper[4721]: I0202 14:19:24.841178 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/263f27d2-5f52-41cf-9ff9-62bd2b195df4-catalog-content\") pod \"263f27d2-5f52-41cf-9ff9-62bd2b195df4\" (UID: \"263f27d2-5f52-41cf-9ff9-62bd2b195df4\") "
Feb 02 14:19:24 crc kubenswrapper[4721]: I0202 14:19:24.843453 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/263f27d2-5f52-41cf-9ff9-62bd2b195df4-utilities" (OuterVolumeSpecName: "utilities") pod "263f27d2-5f52-41cf-9ff9-62bd2b195df4" (UID: "263f27d2-5f52-41cf-9ff9-62bd2b195df4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 14:19:24 crc kubenswrapper[4721]: I0202 14:19:24.851036 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/263f27d2-5f52-41cf-9ff9-62bd2b195df4-kube-api-access-z2qwd" (OuterVolumeSpecName: "kube-api-access-z2qwd") pod "263f27d2-5f52-41cf-9ff9-62bd2b195df4" (UID: "263f27d2-5f52-41cf-9ff9-62bd2b195df4"). InnerVolumeSpecName "kube-api-access-z2qwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 14:19:24 crc kubenswrapper[4721]: I0202 14:19:24.943723 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2qwd\" (UniqueName: \"kubernetes.io/projected/263f27d2-5f52-41cf-9ff9-62bd2b195df4-kube-api-access-z2qwd\") on node \"crc\" DevicePath \"\""
Feb 02 14:19:24 crc kubenswrapper[4721]: I0202 14:19:24.943758 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/263f27d2-5f52-41cf-9ff9-62bd2b195df4-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 14:19:24 crc kubenswrapper[4721]: I0202 14:19:24.968416 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/263f27d2-5f52-41cf-9ff9-62bd2b195df4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "263f27d2-5f52-41cf-9ff9-62bd2b195df4" (UID: "263f27d2-5f52-41cf-9ff9-62bd2b195df4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.046887 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/263f27d2-5f52-41cf-9ff9-62bd2b195df4-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.189937 4721 generic.go:334] "Generic (PLEG): container finished" podID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" containerID="5bd7359f565ae41c756b8b3ce5bf851cdd66a9eb0ea910770311cd6e1d283e4b" exitCode=0
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.189993 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wmglb"
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.189983 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wmglb" event={"ID":"263f27d2-5f52-41cf-9ff9-62bd2b195df4","Type":"ContainerDied","Data":"5bd7359f565ae41c756b8b3ce5bf851cdd66a9eb0ea910770311cd6e1d283e4b"}
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.190381 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wmglb" event={"ID":"263f27d2-5f52-41cf-9ff9-62bd2b195df4","Type":"ContainerDied","Data":"c596e828defbc279db6bc8099dcdc050b683b9191c7efd81b06007f36b53c227"}
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.190407 4721 scope.go:117] "RemoveContainer" containerID="5bd7359f565ae41c756b8b3ce5bf851cdd66a9eb0ea910770311cd6e1d283e4b"
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.228773 4721 scope.go:117] "RemoveContainer" containerID="32274dd2adabed2a33935a26e58c08e0355e6b861fe39345b34b7d33608acbb7"
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.235613 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wmglb"]
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.255524 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wmglb"]
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.265507 4721 scope.go:117] "RemoveContainer" containerID="34d8e37ac136c048e14db0e26d0ed848b873e987fc398b35a174771fee1eaac6"
Feb 02 14:19:25 crc kubenswrapper[4721]: E0202 14:19:25.321542 4721 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod263f27d2_5f52_41cf_9ff9_62bd2b195df4.slice/crio-c596e828defbc279db6bc8099dcdc050b683b9191c7efd81b06007f36b53c227\": RecentStats: unable to find data in memory cache]"
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.328722 4721 scope.go:117] "RemoveContainer" containerID="5bd7359f565ae41c756b8b3ce5bf851cdd66a9eb0ea910770311cd6e1d283e4b"
Feb 02 14:19:25 crc kubenswrapper[4721]: E0202 14:19:25.329201 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bd7359f565ae41c756b8b3ce5bf851cdd66a9eb0ea910770311cd6e1d283e4b\": container with ID starting with 5bd7359f565ae41c756b8b3ce5bf851cdd66a9eb0ea910770311cd6e1d283e4b not found: ID does not exist" containerID="5bd7359f565ae41c756b8b3ce5bf851cdd66a9eb0ea910770311cd6e1d283e4b"
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.329256 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd7359f565ae41c756b8b3ce5bf851cdd66a9eb0ea910770311cd6e1d283e4b"} err="failed to get container status \"5bd7359f565ae41c756b8b3ce5bf851cdd66a9eb0ea910770311cd6e1d283e4b\": rpc error: code = NotFound desc = could not find container \"5bd7359f565ae41c756b8b3ce5bf851cdd66a9eb0ea910770311cd6e1d283e4b\": container with ID starting with 5bd7359f565ae41c756b8b3ce5bf851cdd66a9eb0ea910770311cd6e1d283e4b not found: ID does not exist"
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.329278 4721 scope.go:117] "RemoveContainer" containerID="32274dd2adabed2a33935a26e58c08e0355e6b861fe39345b34b7d33608acbb7"
Feb 02 14:19:25 crc kubenswrapper[4721]: E0202 14:19:25.329673 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32274dd2adabed2a33935a26e58c08e0355e6b861fe39345b34b7d33608acbb7\": container with ID starting with 32274dd2adabed2a33935a26e58c08e0355e6b861fe39345b34b7d33608acbb7 not found: ID does not exist" containerID="32274dd2adabed2a33935a26e58c08e0355e6b861fe39345b34b7d33608acbb7"
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.329699 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32274dd2adabed2a33935a26e58c08e0355e6b861fe39345b34b7d33608acbb7"} err="failed to get container status \"32274dd2adabed2a33935a26e58c08e0355e6b861fe39345b34b7d33608acbb7\": rpc error: code = NotFound desc = could not find container \"32274dd2adabed2a33935a26e58c08e0355e6b861fe39345b34b7d33608acbb7\": container with ID starting with 32274dd2adabed2a33935a26e58c08e0355e6b861fe39345b34b7d33608acbb7 not found: ID does not exist"
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.329717 4721 scope.go:117] "RemoveContainer" containerID="34d8e37ac136c048e14db0e26d0ed848b873e987fc398b35a174771fee1eaac6"
Feb 02 14:19:25 crc kubenswrapper[4721]: E0202 14:19:25.329957 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d8e37ac136c048e14db0e26d0ed848b873e987fc398b35a174771fee1eaac6\": container with ID starting with 34d8e37ac136c048e14db0e26d0ed848b873e987fc398b35a174771fee1eaac6 not found: ID does not exist" containerID="34d8e37ac136c048e14db0e26d0ed848b873e987fc398b35a174771fee1eaac6"
Feb 02 14:19:25 crc kubenswrapper[4721]: I0202 14:19:25.329971 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d8e37ac136c048e14db0e26d0ed848b873e987fc398b35a174771fee1eaac6"} err="failed to get container status \"34d8e37ac136c048e14db0e26d0ed848b873e987fc398b35a174771fee1eaac6\": rpc error: code = NotFound desc = could not find container \"34d8e37ac136c048e14db0e26d0ed848b873e987fc398b35a174771fee1eaac6\": container with ID starting with 34d8e37ac136c048e14db0e26d0ed848b873e987fc398b35a174771fee1eaac6 not found: ID does not exist"
containerID={"Type":"cri-o","ID":"34d8e37ac136c048e14db0e26d0ed848b873e987fc398b35a174771fee1eaac6"} err="failed to get container status \"34d8e37ac136c048e14db0e26d0ed848b873e987fc398b35a174771fee1eaac6\": rpc error: code = NotFound desc = could not find container \"34d8e37ac136c048e14db0e26d0ed848b873e987fc398b35a174771fee1eaac6\": container with ID starting with 34d8e37ac136c048e14db0e26d0ed848b873e987fc398b35a174771fee1eaac6 not found: ID does not exist" Feb 02 14:19:26 crc kubenswrapper[4721]: I0202 14:19:26.433329 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" path="/var/lib/kubelet/pods/263f27d2-5f52-41cf-9ff9-62bd2b195df4/volumes" Feb 02 14:19:28 crc kubenswrapper[4721]: I0202 14:19:28.762755 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:28 crc kubenswrapper[4721]: I0202 14:19:28.764184 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:28 crc kubenswrapper[4721]: I0202 14:19:28.809778 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:29 crc kubenswrapper[4721]: I0202 14:19:29.278802 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:29 crc kubenswrapper[4721]: I0202 14:19:29.991684 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrx6w"] Feb 02 14:19:31 crc kubenswrapper[4721]: I0202 14:19:31.253334 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hrx6w" podUID="3005aa34-5adf-43d0-90b8-82f91624d082" containerName="registry-server" containerID="cri-o://16cf2adabf3bb01467ce25195f0e1b9aafc8eb968866b3449ec06bbbe287ee87" gracePeriod=2 Feb 02 14:19:31 crc kubenswrapper[4721]: I0202 14:19:31.880996 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:31 crc kubenswrapper[4721]: I0202 14:19:31.916113 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3005aa34-5adf-43d0-90b8-82f91624d082-utilities\") pod \"3005aa34-5adf-43d0-90b8-82f91624d082\" (UID: \"3005aa34-5adf-43d0-90b8-82f91624d082\") " Feb 02 14:19:31 crc kubenswrapper[4721]: I0202 14:19:31.917013 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3005aa34-5adf-43d0-90b8-82f91624d082-utilities" (OuterVolumeSpecName: "utilities") pod "3005aa34-5adf-43d0-90b8-82f91624d082" (UID: "3005aa34-5adf-43d0-90b8-82f91624d082"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:19:31 crc kubenswrapper[4721]: I0202 14:19:31.917178 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3005aa34-5adf-43d0-90b8-82f91624d082-catalog-content\") pod \"3005aa34-5adf-43d0-90b8-82f91624d082\" (UID: \"3005aa34-5adf-43d0-90b8-82f91624d082\") " Feb 02 14:19:31 crc kubenswrapper[4721]: I0202 14:19:31.917283 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blxbm\" (UniqueName: \"kubernetes.io/projected/3005aa34-5adf-43d0-90b8-82f91624d082-kube-api-access-blxbm\") pod \"3005aa34-5adf-43d0-90b8-82f91624d082\" (UID: \"3005aa34-5adf-43d0-90b8-82f91624d082\") " Feb 02 14:19:31 crc kubenswrapper[4721]: I0202 14:19:31.918779 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3005aa34-5adf-43d0-90b8-82f91624d082-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 14:19:31 crc kubenswrapper[4721]: I0202 14:19:31.924418 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3005aa34-5adf-43d0-90b8-82f91624d082-kube-api-access-blxbm" (OuterVolumeSpecName: "kube-api-access-blxbm") pod "3005aa34-5adf-43d0-90b8-82f91624d082" (UID: "3005aa34-5adf-43d0-90b8-82f91624d082"). InnerVolumeSpecName "kube-api-access-blxbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 14:19:31 crc kubenswrapper[4721]: I0202 14:19:31.940693 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3005aa34-5adf-43d0-90b8-82f91624d082-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3005aa34-5adf-43d0-90b8-82f91624d082" (UID: "3005aa34-5adf-43d0-90b8-82f91624d082"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.021199 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3005aa34-5adf-43d0-90b8-82f91624d082-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.021231 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blxbm\" (UniqueName: \"kubernetes.io/projected/3005aa34-5adf-43d0-90b8-82f91624d082-kube-api-access-blxbm\") on node \"crc\" DevicePath \"\"" Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.262431 4721 generic.go:334] "Generic (PLEG): container finished" podID="3005aa34-5adf-43d0-90b8-82f91624d082" containerID="16cf2adabf3bb01467ce25195f0e1b9aafc8eb968866b3449ec06bbbe287ee87" exitCode=0 Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.262468 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrx6w" event={"ID":"3005aa34-5adf-43d0-90b8-82f91624d082","Type":"ContainerDied","Data":"16cf2adabf3bb01467ce25195f0e1b9aafc8eb968866b3449ec06bbbe287ee87"} Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.262494 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrx6w" event={"ID":"3005aa34-5adf-43d0-90b8-82f91624d082","Type":"ContainerDied","Data":"c90699de105f06a3330b3309c058512dd68a76e320f7810c34d0e844de217906"} Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.262509 4721 scope.go:117] "RemoveContainer" containerID="16cf2adabf3bb01467ce25195f0e1b9aafc8eb968866b3449ec06bbbe287ee87" Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.262824 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrx6w" Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.293720 4721 scope.go:117] "RemoveContainer" containerID="b7c5d3e0cc3de24c3af77da64cc04f012d62a7c8cb308e3619dc72c6677d70d2" Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.299766 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrx6w"] Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.308556 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrx6w"] Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.328323 4721 scope.go:117] "RemoveContainer" containerID="ad82f58e137f8b92244c3701236f7ce0785115f9c6d9bd8ea8baa43e82f9a1be" Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.377653 4721 scope.go:117] "RemoveContainer" containerID="16cf2adabf3bb01467ce25195f0e1b9aafc8eb968866b3449ec06bbbe287ee87" Feb 02 14:19:32 crc kubenswrapper[4721]: E0202 14:19:32.378203 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16cf2adabf3bb01467ce25195f0e1b9aafc8eb968866b3449ec06bbbe287ee87\": container with ID starting with 16cf2adabf3bb01467ce25195f0e1b9aafc8eb968866b3449ec06bbbe287ee87 not found: ID does not exist" containerID="16cf2adabf3bb01467ce25195f0e1b9aafc8eb968866b3449ec06bbbe287ee87" Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.378233 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16cf2adabf3bb01467ce25195f0e1b9aafc8eb968866b3449ec06bbbe287ee87"} err="failed to get container status \"16cf2adabf3bb01467ce25195f0e1b9aafc8eb968866b3449ec06bbbe287ee87\": rpc error: code = NotFound desc = could not find container \"16cf2adabf3bb01467ce25195f0e1b9aafc8eb968866b3449ec06bbbe287ee87\": container with ID starting with 16cf2adabf3bb01467ce25195f0e1b9aafc8eb968866b3449ec06bbbe287ee87 not found: ID does not exist" Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.378253 4721 scope.go:117] "RemoveContainer" containerID="b7c5d3e0cc3de24c3af77da64cc04f012d62a7c8cb308e3619dc72c6677d70d2" Feb 02 14:19:32 crc kubenswrapper[4721]: E0202 14:19:32.378564 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7c5d3e0cc3de24c3af77da64cc04f012d62a7c8cb308e3619dc72c6677d70d2\": container with ID starting with b7c5d3e0cc3de24c3af77da64cc04f012d62a7c8cb308e3619dc72c6677d70d2 not found: ID does not exist" containerID="b7c5d3e0cc3de24c3af77da64cc04f012d62a7c8cb308e3619dc72c6677d70d2" Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.378625 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c5d3e0cc3de24c3af77da64cc04f012d62a7c8cb308e3619dc72c6677d70d2"} err="failed to get container status \"b7c5d3e0cc3de24c3af77da64cc04f012d62a7c8cb308e3619dc72c6677d70d2\": rpc error: code = NotFound desc = could not find container \"b7c5d3e0cc3de24c3af77da64cc04f012d62a7c8cb308e3619dc72c6677d70d2\": container with ID starting with b7c5d3e0cc3de24c3af77da64cc04f012d62a7c8cb308e3619dc72c6677d70d2 not found: ID does not exist" Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.378664 4721 scope.go:117] "RemoveContainer" containerID="ad82f58e137f8b92244c3701236f7ce0785115f9c6d9bd8ea8baa43e82f9a1be" Feb 02 14:19:32 crc kubenswrapper[4721]: E0202 14:19:32.378988 4721 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ad82f58e137f8b92244c3701236f7ce0785115f9c6d9bd8ea8baa43e82f9a1be\": container with ID starting with ad82f58e137f8b92244c3701236f7ce0785115f9c6d9bd8ea8baa43e82f9a1be not found: ID does not exist" containerID="ad82f58e137f8b92244c3701236f7ce0785115f9c6d9bd8ea8baa43e82f9a1be" Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.379014 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad82f58e137f8b92244c3701236f7ce0785115f9c6d9bd8ea8baa43e82f9a1be"} err="failed to get container status \"ad82f58e137f8b92244c3701236f7ce0785115f9c6d9bd8ea8baa43e82f9a1be\": rpc error: code = NotFound desc = could not find container \"ad82f58e137f8b92244c3701236f7ce0785115f9c6d9bd8ea8baa43e82f9a1be\": container with ID starting with ad82f58e137f8b92244c3701236f7ce0785115f9c6d9bd8ea8baa43e82f9a1be not found: ID does not exist" Feb 02 14:19:32 crc kubenswrapper[4721]: I0202 14:19:32.423276 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3005aa34-5adf-43d0-90b8-82f91624d082" path="/var/lib/kubelet/pods/3005aa34-5adf-43d0-90b8-82f91624d082/volumes" Feb 02 14:19:34 crc kubenswrapper[4721]: I0202 14:19:34.410531 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:19:34 crc kubenswrapper[4721]: E0202 14:19:34.410828 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:19:49 crc kubenswrapper[4721]: I0202 14:19:49.410564 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:19:49 crc kubenswrapper[4721]: E0202 14:19:49.411498 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:20:00 crc kubenswrapper[4721]: I0202 14:20:00.422050 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:20:00 crc kubenswrapper[4721]: E0202 14:20:00.423084 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:20:11 crc kubenswrapper[4721]: I0202 14:20:11.410861 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:20:11 crc kubenswrapper[4721]: E0202 14:20:11.411848 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:20:23 crc kubenswrapper[4721]: I0202 14:20:23.410212 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:20:23 crc kubenswrapper[4721]: E0202 14:20:23.411101 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:20:35 crc kubenswrapper[4721]: I0202 14:20:35.410033 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:20:35 crc kubenswrapper[4721]: E0202 14:20:35.411099 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:20:50 crc kubenswrapper[4721]: I0202 14:20:50.429959 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:20:50 crc kubenswrapper[4721]: E0202 14:20:50.431801 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:21:02 crc kubenswrapper[4721]: I0202 14:21:02.410226 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:21:02 crc kubenswrapper[4721]: E0202 14:21:02.411719 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:21:14 crc kubenswrapper[4721]: I0202 14:21:14.410844 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:21:14 crc kubenswrapper[4721]: E0202 14:21:14.412018 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 14:21:29 crc kubenswrapper[4721]: I0202 14:21:29.410710 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd"
Feb 02 14:21:29 crc kubenswrapper[4721]: E0202 14:21:29.413591 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 14:21:41 crc kubenswrapper[4721]: I0202 14:21:41.411336 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd"
Feb 02 14:21:41 crc kubenswrapper[4721]: E0202 14:21:41.412034 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 14:21:53 crc kubenswrapper[4721]: I0202 14:21:53.410931 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd"
Feb 02 14:21:53 crc kubenswrapper[4721]: E0202 14:21:53.412201 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 14:22:08 crc kubenswrapper[4721]: I0202 14:22:08.410168 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd"
Feb 02 14:22:08 crc kubenswrapper[4721]: E0202 14:22:08.411158 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877"
Feb 02 14:22:23 crc kubenswrapper[4721]: I0202 14:22:23.410572 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd"
Feb 02 14:22:24 crc kubenswrapper[4721]: I0202 14:22:24.111707 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"80a4d97025e59da4be229a82e47246c7035ccedac52df0fa5679fdc8c1c7b8ec"}
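[Note] Between 14:19:34 and 14:22:23 the pod worker re-syncs machine-config-daemon-rppjz every 10-15 s, but each StartContainer is refused while the crash-loop back-off window ("back-off 5m0s", the cap) is still open; once it expires, the container starts at 14:22:24. A sketch of the doubling-with-cap delay shape, assuming the kubelet's usual 10 s initial container back-off (nextBackoff is illustrative, not kubelet code):

package main

import (
	"fmt"
	"time"
)

// nextBackoff doubles the previous delay up to a fixed cap; the cap is
// the "back-off 5m0s" seen in the messages above.
func nextBackoff(cur, limit time.Duration) time.Duration {
	if cur == 0 {
		return 10 * time.Second // assumed initial container back-off
	}
	if next := 2 * cur; next < limit {
		return next
	}
	return limit
}

func main() {
	d := time.Duration(0)
	for i := 0; i < 7; i++ {
		d = nextBackoff(d, 5*time.Minute)
		fmt.Println(d) // 10s 20s 40s 1m20s 2m40s 5m0s 5m0s
	}
}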
pods=["openshift-must-gather-d2dxv/must-gather-kr4jl"] Feb 02 14:22:28 crc kubenswrapper[4721]: E0202 14:22:28.123193 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3005aa34-5adf-43d0-90b8-82f91624d082" containerName="extract-content" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.123213 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3005aa34-5adf-43d0-90b8-82f91624d082" containerName="extract-content" Feb 02 14:22:28 crc kubenswrapper[4721]: E0202 14:22:28.123241 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" containerName="extract-content" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.123250 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" containerName="extract-content" Feb 02 14:22:28 crc kubenswrapper[4721]: E0202 14:22:28.123263 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" containerName="registry-server" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.123273 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" containerName="registry-server" Feb 02 14:22:28 crc kubenswrapper[4721]: E0202 14:22:28.123290 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3005aa34-5adf-43d0-90b8-82f91624d082" containerName="registry-server" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.123298 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3005aa34-5adf-43d0-90b8-82f91624d082" containerName="registry-server" Feb 02 14:22:28 crc kubenswrapper[4721]: E0202 14:22:28.123312 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3005aa34-5adf-43d0-90b8-82f91624d082" containerName="extract-utilities" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.123320 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="3005aa34-5adf-43d0-90b8-82f91624d082" containerName="extract-utilities" Feb 02 14:22:28 crc kubenswrapper[4721]: E0202 14:22:28.123347 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" containerName="extract-utilities" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.123355 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" containerName="extract-utilities" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.123664 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="3005aa34-5adf-43d0-90b8-82f91624d082" containerName="registry-server" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.123679 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="263f27d2-5f52-41cf-9ff9-62bd2b195df4" containerName="registry-server" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.125824 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2dxv/must-gather-kr4jl" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.129210 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d2dxv"/"openshift-service-ca.crt" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.129556 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-d2dxv"/"default-dockercfg-86gxd" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.134609 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d2dxv"/"kube-root-ca.crt" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.139312 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2dxv/must-gather-kr4jl"] Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.192457 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v576\" (UniqueName: \"kubernetes.io/projected/56b96222-739f-41d2-996e-14e2ee91a139-kube-api-access-9v576\") pod \"must-gather-kr4jl\" (UID: \"56b96222-739f-41d2-996e-14e2ee91a139\") " pod="openshift-must-gather-d2dxv/must-gather-kr4jl" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.192909 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/56b96222-739f-41d2-996e-14e2ee91a139-must-gather-output\") pod \"must-gather-kr4jl\" (UID: \"56b96222-739f-41d2-996e-14e2ee91a139\") " pod="openshift-must-gather-d2dxv/must-gather-kr4jl" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.295417 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/56b96222-739f-41d2-996e-14e2ee91a139-must-gather-output\") pod \"must-gather-kr4jl\" (UID: \"56b96222-739f-41d2-996e-14e2ee91a139\") " pod="openshift-must-gather-d2dxv/must-gather-kr4jl" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.295526 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v576\" (UniqueName: \"kubernetes.io/projected/56b96222-739f-41d2-996e-14e2ee91a139-kube-api-access-9v576\") pod \"must-gather-kr4jl\" (UID: \"56b96222-739f-41d2-996e-14e2ee91a139\") " pod="openshift-must-gather-d2dxv/must-gather-kr4jl" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.296055 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/56b96222-739f-41d2-996e-14e2ee91a139-must-gather-output\") pod \"must-gather-kr4jl\" (UID: \"56b96222-739f-41d2-996e-14e2ee91a139\") " pod="openshift-must-gather-d2dxv/must-gather-kr4jl" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.325756 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v576\" (UniqueName: \"kubernetes.io/projected/56b96222-739f-41d2-996e-14e2ee91a139-kube-api-access-9v576\") pod \"must-gather-kr4jl\" (UID: \"56b96222-739f-41d2-996e-14e2ee91a139\") " pod="openshift-must-gather-d2dxv/must-gather-kr4jl" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.450985 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2dxv/must-gather-kr4jl" Feb 02 14:22:28 crc kubenswrapper[4721]: I0202 14:22:28.959389 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2dxv/must-gather-kr4jl"] Feb 02 14:22:29 crc kubenswrapper[4721]: I0202 14:22:29.176277 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2dxv/must-gather-kr4jl" event={"ID":"56b96222-739f-41d2-996e-14e2ee91a139","Type":"ContainerStarted","Data":"58e1e916eac833a2da01583a92d764fb1598205670f01f6a944e39aa58fc1c62"} Feb 02 14:22:34 crc kubenswrapper[4721]: I0202 14:22:34.228403 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2dxv/must-gather-kr4jl" event={"ID":"56b96222-739f-41d2-996e-14e2ee91a139","Type":"ContainerStarted","Data":"79dee8f469c998ce3c1779dff4af47dcba83618a5b09996aa6983b70d3d26f3f"} Feb 02 14:22:34 crc kubenswrapper[4721]: I0202 14:22:34.228900 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2dxv/must-gather-kr4jl" event={"ID":"56b96222-739f-41d2-996e-14e2ee91a139","Type":"ContainerStarted","Data":"ee046980151a1ce898253bd863e95503c1103f7eed2587fbfa57c95be0495a2a"} Feb 02 14:22:34 crc kubenswrapper[4721]: I0202 14:22:34.249015 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d2dxv/must-gather-kr4jl" podStartSLOduration=1.9911042060000002 podStartE2EDuration="6.248995801s" podCreationTimestamp="2026-02-02 14:22:28 +0000 UTC" firstStartedPulling="2026-02-02 14:22:28.976372389 +0000 UTC m=+4889.278886778" lastFinishedPulling="2026-02-02 14:22:33.234263984 +0000 UTC m=+4893.536778373" observedRunningTime="2026-02-02 14:22:34.242888477 +0000 UTC m=+4894.545402886" watchObservedRunningTime="2026-02-02 14:22:34.248995801 +0000 UTC m=+4894.551510190" Feb 02 14:22:39 crc kubenswrapper[4721]: I0202 14:22:39.153000 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2dxv/crc-debug-nsq7x"] Feb 02 14:22:39 crc kubenswrapper[4721]: I0202 14:22:39.155118 4721 util.go:30] "No sandbox for pod can be found. 
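[Note] The durations in the record above can be reproduced from its own timestamps: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick check in Go, with the monotonic "m=+..." suffixes dropped before parsing:

package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	// layout matches the log's timestamps once the monotonic part is dropped
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2026-02-02 14:22:28 +0000 UTC")
	pullStart := mustParse("2026-02-02 14:22:28.976372389 +0000 UTC")
	pullEnd := mustParse("2026-02-02 14:22:33.234263984 +0000 UTC")
	running := mustParse("2026-02-02 14:22:34.248995801 +0000 UTC") // watchObservedRunningTime

	e2e := running.Sub(created)
	slo := e2e - pullEnd.Sub(pullStart) // startup time excluding image pull
	fmt.Println(e2e) // 6.248995801s -> podStartE2EDuration
	fmt.Println(slo) // 1.991104206s -> podStartSLOduration
}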
Need to start a new one" pod="openshift-must-gather-d2dxv/crc-debug-nsq7x" Feb 02 14:22:39 crc kubenswrapper[4721]: I0202 14:22:39.182115 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08510db5-eac2-4005-abe5-4cb8bb7604dc-host\") pod \"crc-debug-nsq7x\" (UID: \"08510db5-eac2-4005-abe5-4cb8bb7604dc\") " pod="openshift-must-gather-d2dxv/crc-debug-nsq7x" Feb 02 14:22:39 crc kubenswrapper[4721]: I0202 14:22:39.182238 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk22m\" (UniqueName: \"kubernetes.io/projected/08510db5-eac2-4005-abe5-4cb8bb7604dc-kube-api-access-pk22m\") pod \"crc-debug-nsq7x\" (UID: \"08510db5-eac2-4005-abe5-4cb8bb7604dc\") " pod="openshift-must-gather-d2dxv/crc-debug-nsq7x" Feb 02 14:22:39 crc kubenswrapper[4721]: I0202 14:22:39.284973 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08510db5-eac2-4005-abe5-4cb8bb7604dc-host\") pod \"crc-debug-nsq7x\" (UID: \"08510db5-eac2-4005-abe5-4cb8bb7604dc\") " pod="openshift-must-gather-d2dxv/crc-debug-nsq7x" Feb 02 14:22:39 crc kubenswrapper[4721]: I0202 14:22:39.285098 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk22m\" (UniqueName: \"kubernetes.io/projected/08510db5-eac2-4005-abe5-4cb8bb7604dc-kube-api-access-pk22m\") pod \"crc-debug-nsq7x\" (UID: \"08510db5-eac2-4005-abe5-4cb8bb7604dc\") " pod="openshift-must-gather-d2dxv/crc-debug-nsq7x" Feb 02 14:22:39 crc kubenswrapper[4721]: I0202 14:22:39.285128 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08510db5-eac2-4005-abe5-4cb8bb7604dc-host\") pod \"crc-debug-nsq7x\" (UID: \"08510db5-eac2-4005-abe5-4cb8bb7604dc\") " pod="openshift-must-gather-d2dxv/crc-debug-nsq7x" Feb 02 14:22:39 crc kubenswrapper[4721]: I0202 14:22:39.304337 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk22m\" (UniqueName: \"kubernetes.io/projected/08510db5-eac2-4005-abe5-4cb8bb7604dc-kube-api-access-pk22m\") pod \"crc-debug-nsq7x\" (UID: \"08510db5-eac2-4005-abe5-4cb8bb7604dc\") " pod="openshift-must-gather-d2dxv/crc-debug-nsq7x" Feb 02 14:22:39 crc kubenswrapper[4721]: I0202 14:22:39.477783 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2dxv/crc-debug-nsq7x" Feb 02 14:22:40 crc kubenswrapper[4721]: I0202 14:22:40.298514 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2dxv/crc-debug-nsq7x" event={"ID":"08510db5-eac2-4005-abe5-4cb8bb7604dc","Type":"ContainerStarted","Data":"3a24955b413585c5b93167ef896b4245df4484e81497f3b0e51abc71423c9d5c"} Feb 02 14:22:53 crc kubenswrapper[4721]: I0202 14:22:53.449623 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2dxv/crc-debug-nsq7x" event={"ID":"08510db5-eac2-4005-abe5-4cb8bb7604dc","Type":"ContainerStarted","Data":"3d42ccd527c1c49d2ecc79f70ee1abbc38a002f9404fa0303071492300851127"} Feb 02 14:22:53 crc kubenswrapper[4721]: I0202 14:22:53.477137 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d2dxv/crc-debug-nsq7x" podStartSLOduration=1.518988344 podStartE2EDuration="14.4771186s" podCreationTimestamp="2026-02-02 14:22:39 +0000 UTC" firstStartedPulling="2026-02-02 14:22:39.514970465 +0000 UTC m=+4899.817484854" lastFinishedPulling="2026-02-02 14:22:52.473100721 +0000 UTC m=+4912.775615110" observedRunningTime="2026-02-02 14:22:53.470460021 +0000 UTC m=+4913.772974420" watchObservedRunningTime="2026-02-02 14:22:53.4771186 +0000 UTC m=+4913.779632999" Feb 02 14:23:15 crc kubenswrapper[4721]: I0202 14:23:15.679286 4721 generic.go:334] "Generic (PLEG): container finished" podID="08510db5-eac2-4005-abe5-4cb8bb7604dc" containerID="3d42ccd527c1c49d2ecc79f70ee1abbc38a002f9404fa0303071492300851127" exitCode=0 Feb 02 14:23:15 crc kubenswrapper[4721]: I0202 14:23:15.679359 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2dxv/crc-debug-nsq7x" event={"ID":"08510db5-eac2-4005-abe5-4cb8bb7604dc","Type":"ContainerDied","Data":"3d42ccd527c1c49d2ecc79f70ee1abbc38a002f9404fa0303071492300851127"} Feb 02 14:23:16 crc kubenswrapper[4721]: I0202 14:23:16.825875 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2dxv/crc-debug-nsq7x" Feb 02 14:23:16 crc kubenswrapper[4721]: I0202 14:23:16.861826 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d2dxv/crc-debug-nsq7x"] Feb 02 14:23:16 crc kubenswrapper[4721]: I0202 14:23:16.874871 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d2dxv/crc-debug-nsq7x"] Feb 02 14:23:16 crc kubenswrapper[4721]: I0202 14:23:16.901167 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08510db5-eac2-4005-abe5-4cb8bb7604dc-host\") pod \"08510db5-eac2-4005-abe5-4cb8bb7604dc\" (UID: \"08510db5-eac2-4005-abe5-4cb8bb7604dc\") " Feb 02 14:23:16 crc kubenswrapper[4721]: I0202 14:23:16.901328 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08510db5-eac2-4005-abe5-4cb8bb7604dc-host" (OuterVolumeSpecName: "host") pod "08510db5-eac2-4005-abe5-4cb8bb7604dc" (UID: "08510db5-eac2-4005-abe5-4cb8bb7604dc"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 14:23:16 crc kubenswrapper[4721]: I0202 14:23:16.901537 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk22m\" (UniqueName: \"kubernetes.io/projected/08510db5-eac2-4005-abe5-4cb8bb7604dc-kube-api-access-pk22m\") pod \"08510db5-eac2-4005-abe5-4cb8bb7604dc\" (UID: \"08510db5-eac2-4005-abe5-4cb8bb7604dc\") " Feb 02 14:23:16 crc kubenswrapper[4721]: I0202 14:23:16.902382 4721 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08510db5-eac2-4005-abe5-4cb8bb7604dc-host\") on node \"crc\" DevicePath \"\"" Feb 02 14:23:16 crc kubenswrapper[4721]: I0202 14:23:16.909060 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08510db5-eac2-4005-abe5-4cb8bb7604dc-kube-api-access-pk22m" (OuterVolumeSpecName: "kube-api-access-pk22m") pod "08510db5-eac2-4005-abe5-4cb8bb7604dc" (UID: "08510db5-eac2-4005-abe5-4cb8bb7604dc"). InnerVolumeSpecName "kube-api-access-pk22m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 14:23:17 crc kubenswrapper[4721]: I0202 14:23:17.004919 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk22m\" (UniqueName: \"kubernetes.io/projected/08510db5-eac2-4005-abe5-4cb8bb7604dc-kube-api-access-pk22m\") on node \"crc\" DevicePath \"\"" Feb 02 14:23:17 crc kubenswrapper[4721]: I0202 14:23:17.726004 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a24955b413585c5b93167ef896b4245df4484e81497f3b0e51abc71423c9d5c" Feb 02 14:23:17 crc kubenswrapper[4721]: I0202 14:23:17.726173 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2dxv/crc-debug-nsq7x" Feb 02 14:23:18 crc kubenswrapper[4721]: I0202 14:23:18.277471 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2dxv/crc-debug-br45z"] Feb 02 14:23:18 crc kubenswrapper[4721]: E0202 14:23:18.278150 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08510db5-eac2-4005-abe5-4cb8bb7604dc" containerName="container-00" Feb 02 14:23:18 crc kubenswrapper[4721]: I0202 14:23:18.278162 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="08510db5-eac2-4005-abe5-4cb8bb7604dc" containerName="container-00" Feb 02 14:23:18 crc kubenswrapper[4721]: I0202 14:23:18.278392 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="08510db5-eac2-4005-abe5-4cb8bb7604dc" containerName="container-00" Feb 02 14:23:18 crc kubenswrapper[4721]: I0202 14:23:18.279255 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2dxv/crc-debug-br45z" Feb 02 14:23:18 crc kubenswrapper[4721]: I0202 14:23:18.335749 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb746\" (UniqueName: \"kubernetes.io/projected/00ff63e7-883d-4e6d-985b-9122517d5081-kube-api-access-rb746\") pod \"crc-debug-br45z\" (UID: \"00ff63e7-883d-4e6d-985b-9122517d5081\") " pod="openshift-must-gather-d2dxv/crc-debug-br45z" Feb 02 14:23:18 crc kubenswrapper[4721]: I0202 14:23:18.335862 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00ff63e7-883d-4e6d-985b-9122517d5081-host\") pod \"crc-debug-br45z\" (UID: \"00ff63e7-883d-4e6d-985b-9122517d5081\") " pod="openshift-must-gather-d2dxv/crc-debug-br45z" Feb 02 14:23:18 crc kubenswrapper[4721]: I0202 14:23:18.425887 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08510db5-eac2-4005-abe5-4cb8bb7604dc" path="/var/lib/kubelet/pods/08510db5-eac2-4005-abe5-4cb8bb7604dc/volumes" Feb 02 14:23:18 crc kubenswrapper[4721]: I0202 14:23:18.437783 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00ff63e7-883d-4e6d-985b-9122517d5081-host\") pod \"crc-debug-br45z\" (UID: \"00ff63e7-883d-4e6d-985b-9122517d5081\") " pod="openshift-must-gather-d2dxv/crc-debug-br45z" Feb 02 14:23:18 crc kubenswrapper[4721]: I0202 14:23:18.437987 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00ff63e7-883d-4e6d-985b-9122517d5081-host\") pod \"crc-debug-br45z\" (UID: \"00ff63e7-883d-4e6d-985b-9122517d5081\") " pod="openshift-must-gather-d2dxv/crc-debug-br45z" Feb 02 14:23:18 crc kubenswrapper[4721]: I0202 14:23:18.438015 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb746\" (UniqueName: \"kubernetes.io/projected/00ff63e7-883d-4e6d-985b-9122517d5081-kube-api-access-rb746\") pod \"crc-debug-br45z\" (UID: \"00ff63e7-883d-4e6d-985b-9122517d5081\") " pod="openshift-must-gather-d2dxv/crc-debug-br45z" Feb 02 14:23:18 crc kubenswrapper[4721]: I0202 14:23:18.870981 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb746\" (UniqueName: \"kubernetes.io/projected/00ff63e7-883d-4e6d-985b-9122517d5081-kube-api-access-rb746\") pod \"crc-debug-br45z\" (UID: \"00ff63e7-883d-4e6d-985b-9122517d5081\") " pod="openshift-must-gather-d2dxv/crc-debug-br45z" Feb 02 14:23:18 crc kubenswrapper[4721]: I0202 14:23:18.901497 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2dxv/crc-debug-br45z" Feb 02 14:23:19 crc kubenswrapper[4721]: I0202 14:23:19.746437 4721 generic.go:334] "Generic (PLEG): container finished" podID="00ff63e7-883d-4e6d-985b-9122517d5081" containerID="c5e695876117098c7aa3acb18071dce73b746830410f10fe488c54040af4236a" exitCode=1 Feb 02 14:23:19 crc kubenswrapper[4721]: I0202 14:23:19.746529 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2dxv/crc-debug-br45z" event={"ID":"00ff63e7-883d-4e6d-985b-9122517d5081","Type":"ContainerDied","Data":"c5e695876117098c7aa3acb18071dce73b746830410f10fe488c54040af4236a"} Feb 02 14:23:19 crc kubenswrapper[4721]: I0202 14:23:19.746999 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2dxv/crc-debug-br45z" event={"ID":"00ff63e7-883d-4e6d-985b-9122517d5081","Type":"ContainerStarted","Data":"ab857196e7ef95f6d0e888d211bada36de4151ce2298d06ba4d3ef0ee53bfbe5"} Feb 02 14:23:19 crc kubenswrapper[4721]: I0202 14:23:19.788530 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d2dxv/crc-debug-br45z"] Feb 02 14:23:19 crc kubenswrapper[4721]: I0202 14:23:19.798527 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d2dxv/crc-debug-br45z"] Feb 02 14:23:21 crc kubenswrapper[4721]: I0202 14:23:21.296449 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2dxv/crc-debug-br45z" Feb 02 14:23:21 crc kubenswrapper[4721]: I0202 14:23:21.407925 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb746\" (UniqueName: \"kubernetes.io/projected/00ff63e7-883d-4e6d-985b-9122517d5081-kube-api-access-rb746\") pod \"00ff63e7-883d-4e6d-985b-9122517d5081\" (UID: \"00ff63e7-883d-4e6d-985b-9122517d5081\") " Feb 02 14:23:21 crc kubenswrapper[4721]: I0202 14:23:21.408046 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00ff63e7-883d-4e6d-985b-9122517d5081-host\") pod \"00ff63e7-883d-4e6d-985b-9122517d5081\" (UID: \"00ff63e7-883d-4e6d-985b-9122517d5081\") " Feb 02 14:23:21 crc kubenswrapper[4721]: I0202 14:23:21.408138 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00ff63e7-883d-4e6d-985b-9122517d5081-host" (OuterVolumeSpecName: "host") pod "00ff63e7-883d-4e6d-985b-9122517d5081" (UID: "00ff63e7-883d-4e6d-985b-9122517d5081"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 14:23:21 crc kubenswrapper[4721]: I0202 14:23:21.408818 4721 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00ff63e7-883d-4e6d-985b-9122517d5081-host\") on node \"crc\" DevicePath \"\"" Feb 02 14:23:21 crc kubenswrapper[4721]: I0202 14:23:21.416363 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00ff63e7-883d-4e6d-985b-9122517d5081-kube-api-access-rb746" (OuterVolumeSpecName: "kube-api-access-rb746") pod "00ff63e7-883d-4e6d-985b-9122517d5081" (UID: "00ff63e7-883d-4e6d-985b-9122517d5081"). InnerVolumeSpecName "kube-api-access-rb746". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 14:23:21 crc kubenswrapper[4721]: I0202 14:23:21.510777 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb746\" (UniqueName: \"kubernetes.io/projected/00ff63e7-883d-4e6d-985b-9122517d5081-kube-api-access-rb746\") on node \"crc\" DevicePath \"\"" Feb 02 14:23:21 crc kubenswrapper[4721]: I0202 14:23:21.768804 4721 scope.go:117] "RemoveContainer" containerID="c5e695876117098c7aa3acb18071dce73b746830410f10fe488c54040af4236a" Feb 02 14:23:21 crc kubenswrapper[4721]: I0202 14:23:21.769551 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2dxv/crc-debug-br45z" Feb 02 14:23:22 crc kubenswrapper[4721]: I0202 14:23:22.425124 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00ff63e7-883d-4e6d-985b-9122517d5081" path="/var/lib/kubelet/pods/00ff63e7-883d-4e6d-985b-9122517d5081/volumes" Feb 02 14:24:14 crc kubenswrapper[4721]: I0202 14:24:14.530933 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-57c58bbb98-gpbp2_183927fe-ec27-461b-8284-3e71f5cb666a/barbican-api/0.log" Feb 02 14:24:14 crc kubenswrapper[4721]: I0202 14:24:14.708017 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-57c58bbb98-gpbp2_183927fe-ec27-461b-8284-3e71f5cb666a/barbican-api-log/0.log" Feb 02 14:24:14 crc kubenswrapper[4721]: I0202 14:24:14.723203 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f4497866b-px6fz_755b5957-fcfa-486a-8e63-d562742d6650/barbican-keystone-listener/0.log" Feb 02 14:24:14 crc kubenswrapper[4721]: I0202 14:24:14.804853 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f4497866b-px6fz_755b5957-fcfa-486a-8e63-d562742d6650/barbican-keystone-listener-log/0.log" Feb 02 14:24:14 crc kubenswrapper[4721]: I0202 14:24:14.962211 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-d4595f9f9-4d2g5_93a7211b-9a15-4765-99e2-520bd1d62ff1/barbican-worker/0.log" Feb 02 14:24:15 crc kubenswrapper[4721]: I0202 14:24:15.004490 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-d4595f9f9-4d2g5_93a7211b-9a15-4765-99e2-520bd1d62ff1/barbican-worker-log/0.log" Feb 02 14:24:15 crc kubenswrapper[4721]: I0202 14:24:15.408276 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c678f02b-cbee-4578-9e28-067b63af2682/ceilometer-central-agent/0.log" Feb 02 14:24:15 crc kubenswrapper[4721]: I0202 14:24:15.420355 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c678f02b-cbee-4578-9e28-067b63af2682/proxy-httpd/0.log" Feb 02 14:24:15 crc kubenswrapper[4721]: I0202 14:24:15.442662 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c678f02b-cbee-4578-9e28-067b63af2682/sg-core/0.log" Feb 02 14:24:15 crc kubenswrapper[4721]: I0202 14:24:15.467090 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_c678f02b-cbee-4578-9e28-067b63af2682/ceilometer-notification-agent/0.log" Feb 02 14:24:15 crc kubenswrapper[4721]: I0202 14:24:15.674355 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0eabfa0b-0304-4eda-8f8a-dc9160569e4b/cinder-api/0.log" Feb 02 14:24:15 crc kubenswrapper[4721]: I0202 14:24:15.777188 4721 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_0eabfa0b-0304-4eda-8f8a-dc9160569e4b/cinder-api-log/0.log" Feb 02 14:24:15 crc kubenswrapper[4721]: I0202 14:24:15.978540 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7d63f1df-bbdc-42ee-a234-2d691a3ce7ba/cinder-scheduler/0.log" Feb 02 14:24:16 crc kubenswrapper[4721]: I0202 14:24:16.046371 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7d63f1df-bbdc-42ee-a234-2d691a3ce7ba/probe/0.log" Feb 02 14:24:16 crc kubenswrapper[4721]: I0202 14:24:16.137182 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b7bbf7cf9-fz8n5_1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b/init/0.log" Feb 02 14:24:16 crc kubenswrapper[4721]: I0202 14:24:16.344002 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b7bbf7cf9-fz8n5_1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b/init/0.log" Feb 02 14:24:16 crc kubenswrapper[4721]: I0202 14:24:16.351158 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b7bbf7cf9-fz8n5_1f4d7a08-5d54-424f-a9a6-e9ddd07b3b9b/dnsmasq-dns/0.log" Feb 02 14:24:16 crc kubenswrapper[4721]: I0202 14:24:16.440648 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_23e9328e-fd9a-4a87-946b-2c46e25bea51/glance-httpd/0.log" Feb 02 14:24:16 crc kubenswrapper[4721]: I0202 14:24:16.562191 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_23e9328e-fd9a-4a87-946b-2c46e25bea51/glance-log/0.log" Feb 02 14:24:16 crc kubenswrapper[4721]: I0202 14:24:16.656226 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f5129b1-fc26-40ba-9cf7-0f86e93507cd/glance-log/0.log" Feb 02 14:24:16 crc kubenswrapper[4721]: I0202 14:24:16.667052 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f5129b1-fc26-40ba-9cf7-0f86e93507cd/glance-httpd/0.log" Feb 02 14:24:17 crc kubenswrapper[4721]: I0202 14:24:17.248243 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-596786fd64-rpzql_6616d7ae-f7c6-4fe2-bf68-a0668a84fb8c/heat-engine/0.log" Feb 02 14:24:17 crc kubenswrapper[4721]: I0202 14:24:17.436491 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-86974d69bd-t6gcz_5d1412d5-76f7-4132-889d-f706432b3ecc/heat-api/0.log" Feb 02 14:24:17 crc kubenswrapper[4721]: I0202 14:24:17.498451 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-64c55c4cc7-4htzp_5c23b064-e24b-4ab3-886d-d731004b7479/heat-cfnapi/0.log" Feb 02 14:24:17 crc kubenswrapper[4721]: I0202 14:24:17.884780 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29500681-7xs4g_9b53b618-4727-4a17-a000-ba0ccd1084c1/keystone-cron/0.log" Feb 02 14:24:18 crc kubenswrapper[4721]: I0202 14:24:18.028144 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-784866f846-pjz9x_5883cb27-6bc8-4309-aeac-64a54a46eb89/keystone-api/0.log" Feb 02 14:24:18 crc kubenswrapper[4721]: I0202 14:24:18.134007 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ac827915-eefd-428b-9303-581069f92ed8/kube-state-metrics/0.log" Feb 02 14:24:18 crc kubenswrapper[4721]: I0202 14:24:18.433805 4721 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_mysqld-exporter-0_8abde028-43c5-4489-8de6-7c2da9f037c2/mysqld-exporter/0.log" Feb 02 14:24:18 crc kubenswrapper[4721]: I0202 14:24:18.735866 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-74cc678f5-fkzpw_40093ddb-a585-427d-88f6-110b4ea07578/neutron-api/0.log" Feb 02 14:24:18 crc kubenswrapper[4721]: I0202 14:24:18.799407 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-74cc678f5-fkzpw_40093ddb-a585-427d-88f6-110b4ea07578/neutron-httpd/0.log" Feb 02 14:24:19 crc kubenswrapper[4721]: I0202 14:24:19.190802 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8eccca7c-e269-4ecc-9fce-024196f66aaa/nova-api-log/0.log" Feb 02 14:24:19 crc kubenswrapper[4721]: I0202 14:24:19.355022 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_562aee22-e2a0-4706-b65a-7e9398823dec/nova-cell0-conductor-conductor/0.log" Feb 02 14:24:19 crc kubenswrapper[4721]: I0202 14:24:19.500874 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_8eccca7c-e269-4ecc-9fce-024196f66aaa/nova-api-api/0.log" Feb 02 14:24:20 crc kubenswrapper[4721]: I0202 14:24:20.052514 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_3830e692-ad9d-48c7-800f-dc63cadb2376/nova-cell1-conductor-conductor/0.log" Feb 02 14:24:20 crc kubenswrapper[4721]: I0202 14:24:20.147237 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_ab8f3d4c-b262-4b71-a934-f584c1f07790/nova-cell1-novncproxy-novncproxy/0.log" Feb 02 14:24:20 crc kubenswrapper[4721]: I0202 14:24:20.298439 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f6e14b26-cab3-4acd-aad2-8cda004e0282/nova-metadata-log/0.log" Feb 02 14:24:20 crc kubenswrapper[4721]: I0202 14:24:20.608989 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d3e7848b-5b3e-4e6b-8c5e-82cd9f2f7728/nova-scheduler-scheduler/0.log" Feb 02 14:24:20 crc kubenswrapper[4721]: I0202 14:24:20.659623 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2505bd6b-64d4-4d17-9c1a-0e89562612be/mysql-bootstrap/0.log" Feb 02 14:24:20 crc kubenswrapper[4721]: I0202 14:24:20.939371 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2505bd6b-64d4-4d17-9c1a-0e89562612be/mysql-bootstrap/0.log" Feb 02 14:24:20 crc kubenswrapper[4721]: I0202 14:24:20.964686 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_2505bd6b-64d4-4d17-9c1a-0e89562612be/galera/0.log" Feb 02 14:24:21 crc kubenswrapper[4721]: I0202 14:24:21.197352 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4/mysql-bootstrap/0.log" Feb 02 14:24:21 crc kubenswrapper[4721]: I0202 14:24:21.446583 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4/mysql-bootstrap/0.log" Feb 02 14:24:21 crc kubenswrapper[4721]: I0202 14:24:21.499817 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_f5bedd25-b07d-44e9-a7e5-cfa263ad4fa4/galera/0.log" Feb 02 14:24:21 crc kubenswrapper[4721]: I0202 14:24:21.658570 4721 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_32729b18-a175-4abd-a8cf-392d318b64d8/openstackclient/0.log" Feb 02 14:24:21 crc kubenswrapper[4721]: I0202 14:24:21.788813 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-l5h78_298ac2ef-6edb-40cb-bb92-8a8e039f333b/ovn-controller/0.log" Feb 02 14:24:22 crc kubenswrapper[4721]: I0202 14:24:22.041099 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hkwkv_753a63ae-970e-4dd1-a284-bc3b6027ca64/openstack-network-exporter/0.log" Feb 02 14:24:22 crc kubenswrapper[4721]: I0202 14:24:22.182763 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f6e14b26-cab3-4acd-aad2-8cda004e0282/nova-metadata-metadata/0.log" Feb 02 14:24:22 crc kubenswrapper[4721]: I0202 14:24:22.191566 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gz9nz_a75df612-e3f4-4ea3-bfc8-daceaf59205d/ovsdb-server-init/0.log" Feb 02 14:24:22 crc kubenswrapper[4721]: I0202 14:24:22.413399 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gz9nz_a75df612-e3f4-4ea3-bfc8-daceaf59205d/ovsdb-server/0.log" Feb 02 14:24:22 crc kubenswrapper[4721]: I0202 14:24:22.479913 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gz9nz_a75df612-e3f4-4ea3-bfc8-daceaf59205d/ovsdb-server-init/0.log" Feb 02 14:24:22 crc kubenswrapper[4721]: I0202 14:24:22.495028 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gz9nz_a75df612-e3f4-4ea3-bfc8-daceaf59205d/ovs-vswitchd/0.log" Feb 02 14:24:22 crc kubenswrapper[4721]: I0202 14:24:22.626468 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cd5938c1-e4b9-4437-a379-c25bc5b1c243/openstack-network-exporter/0.log" Feb 02 14:24:22 crc kubenswrapper[4721]: I0202 14:24:22.755808 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cd5938c1-e4b9-4437-a379-c25bc5b1c243/ovn-northd/0.log" Feb 02 14:24:22 crc kubenswrapper[4721]: I0202 14:24:22.872626 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_080bfc29-50bc-4ba1-b097-4f5c54586d8c/openstack-network-exporter/0.log" Feb 02 14:24:22 crc kubenswrapper[4721]: I0202 14:24:22.908355 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_080bfc29-50bc-4ba1-b097-4f5c54586d8c/ovsdbserver-nb/0.log" Feb 02 14:24:23 crc kubenswrapper[4721]: I0202 14:24:23.045665 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4e175d27-fe10-4fb7-9ce6-cb98379357cc/openstack-network-exporter/0.log" Feb 02 14:24:23 crc kubenswrapper[4721]: I0202 14:24:23.194554 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4e175d27-fe10-4fb7-9ce6-cb98379357cc/ovsdbserver-sb/0.log" Feb 02 14:24:23 crc kubenswrapper[4721]: I0202 14:24:23.345897 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6ccdcdf5fb-gncnr_8e3f4574-6ad6-4b37-abf5-2005c8692a44/placement-api/0.log" Feb 02 14:24:23 crc kubenswrapper[4721]: I0202 14:24:23.434117 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6ccdcdf5fb-gncnr_8e3f4574-6ad6-4b37-abf5-2005c8692a44/placement-log/0.log" Feb 02 14:24:23 crc kubenswrapper[4721]: I0202 14:24:23.449416 4721 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_6a34d077-087f-4b04-98c5-22e09450dcb3/init-config-reloader/0.log" Feb 02 14:24:23 crc kubenswrapper[4721]: I0202 14:24:23.937079 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6a34d077-087f-4b04-98c5-22e09450dcb3/init-config-reloader/0.log" Feb 02 14:24:23 crc kubenswrapper[4721]: I0202 14:24:23.976358 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6a34d077-087f-4b04-98c5-22e09450dcb3/thanos-sidecar/0.log" Feb 02 14:24:24 crc kubenswrapper[4721]: I0202 14:24:24.010044 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6a34d077-087f-4b04-98c5-22e09450dcb3/config-reloader/0.log" Feb 02 14:24:24 crc kubenswrapper[4721]: I0202 14:24:24.023647 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6a34d077-087f-4b04-98c5-22e09450dcb3/prometheus/0.log" Feb 02 14:24:24 crc kubenswrapper[4721]: I0202 14:24:24.269390 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_496bb19e-217b-4896-9bee-8082ac5da28b/setup-container/0.log" Feb 02 14:24:24 crc kubenswrapper[4721]: I0202 14:24:24.403561 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_496bb19e-217b-4896-9bee-8082ac5da28b/setup-container/0.log" Feb 02 14:24:24 crc kubenswrapper[4721]: I0202 14:24:24.435821 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_496bb19e-217b-4896-9bee-8082ac5da28b/rabbitmq/0.log" Feb 02 14:24:24 crc kubenswrapper[4721]: I0202 14:24:24.527682 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_a57cea33-806c-4028-b59f-9f5e65289eac/setup-container/0.log" Feb 02 14:24:24 crc kubenswrapper[4721]: I0202 14:24:24.774818 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_a57cea33-806c-4028-b59f-9f5e65289eac/setup-container/0.log" Feb 02 14:24:24 crc kubenswrapper[4721]: I0202 14:24:24.811028 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_a57cea33-806c-4028-b59f-9f5e65289eac/rabbitmq/0.log" Feb 02 14:24:24 crc kubenswrapper[4721]: I0202 14:24:24.875987 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_4d21d961-1540-4610-89c0-ee265f66d728/setup-container/0.log" Feb 02 14:24:25 crc kubenswrapper[4721]: I0202 14:24:25.157748 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_b6dbd607-3fa8-48e0-b420-4e939a47c460/setup-container/0.log" Feb 02 14:24:25 crc kubenswrapper[4721]: I0202 14:24:25.177890 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_4d21d961-1540-4610-89c0-ee265f66d728/setup-container/0.log" Feb 02 14:24:25 crc kubenswrapper[4721]: I0202 14:24:25.198130 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_4d21d961-1540-4610-89c0-ee265f66d728/rabbitmq/0.log" Feb 02 14:24:25 crc kubenswrapper[4721]: I0202 14:24:25.452036 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_b6dbd607-3fa8-48e0-b420-4e939a47c460/setup-container/0.log" Feb 02 14:24:25 crc kubenswrapper[4721]: I0202 14:24:25.470699 4721 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-2_b6dbd607-3fa8-48e0-b420-4e939a47c460/rabbitmq/0.log" Feb 02 14:24:25 crc kubenswrapper[4721]: I0202 14:24:25.615157 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9b87bd57c-2glsn_c04183e6-a1f0-4d8c-aa00-8dd660336a3b/proxy-httpd/0.log" Feb 02 14:24:25 crc kubenswrapper[4721]: I0202 14:24:25.642406 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9b87bd57c-2glsn_c04183e6-a1f0-4d8c-aa00-8dd660336a3b/proxy-server/0.log" Feb 02 14:24:25 crc kubenswrapper[4721]: I0202 14:24:25.743882 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-4rnrx_bd1f15d5-77dc-4b6d-81bf-c2a8286da820/swift-ring-rebalance/0.log" Feb 02 14:24:25 crc kubenswrapper[4721]: I0202 14:24:25.955245 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eabe6b07-da9d-4980-99b4-12c02640c88d/account-auditor/0.log" Feb 02 14:24:26 crc kubenswrapper[4721]: I0202 14:24:26.010157 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eabe6b07-da9d-4980-99b4-12c02640c88d/account-replicator/0.log" Feb 02 14:24:26 crc kubenswrapper[4721]: I0202 14:24:26.037505 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eabe6b07-da9d-4980-99b4-12c02640c88d/account-reaper/0.log" Feb 02 14:24:26 crc kubenswrapper[4721]: I0202 14:24:26.124518 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eabe6b07-da9d-4980-99b4-12c02640c88d/account-server/0.log" Feb 02 14:24:26 crc kubenswrapper[4721]: I0202 14:24:26.240912 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eabe6b07-da9d-4980-99b4-12c02640c88d/container-auditor/0.log" Feb 02 14:24:26 crc kubenswrapper[4721]: I0202 14:24:26.271505 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eabe6b07-da9d-4980-99b4-12c02640c88d/container-replicator/0.log" Feb 02 14:24:26 crc kubenswrapper[4721]: I0202 14:24:26.293380 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eabe6b07-da9d-4980-99b4-12c02640c88d/container-server/0.log" Feb 02 14:24:26 crc kubenswrapper[4721]: I0202 14:24:26.331745 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eabe6b07-da9d-4980-99b4-12c02640c88d/container-updater/0.log" Feb 02 14:24:26 crc kubenswrapper[4721]: I0202 14:24:26.485133 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eabe6b07-da9d-4980-99b4-12c02640c88d/object-expirer/0.log" Feb 02 14:24:26 crc kubenswrapper[4721]: I0202 14:24:26.485414 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eabe6b07-da9d-4980-99b4-12c02640c88d/object-auditor/0.log" Feb 02 14:24:26 crc kubenswrapper[4721]: I0202 14:24:26.572680 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eabe6b07-da9d-4980-99b4-12c02640c88d/object-server/0.log" Feb 02 14:24:26 crc kubenswrapper[4721]: I0202 14:24:26.608857 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eabe6b07-da9d-4980-99b4-12c02640c88d/object-replicator/0.log" Feb 02 14:24:26 crc kubenswrapper[4721]: I0202 14:24:26.740917 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eabe6b07-da9d-4980-99b4-12c02640c88d/object-updater/0.log" Feb 
02 14:24:26 crc kubenswrapper[4721]: I0202 14:24:26.775166 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eabe6b07-da9d-4980-99b4-12c02640c88d/rsync/0.log" Feb 02 14:24:26 crc kubenswrapper[4721]: I0202 14:24:26.811101 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_eabe6b07-da9d-4980-99b4-12c02640c88d/swift-recon-cron/0.log" Feb 02 14:24:32 crc kubenswrapper[4721]: I0202 14:24:32.932172 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_a686ac60-f231-4070-98c7-7acbc66c29d5/memcached/0.log" Feb 02 14:24:44 crc kubenswrapper[4721]: I0202 14:24:44.766434 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:24:44 crc kubenswrapper[4721]: I0202 14:24:44.767045 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:24:56 crc kubenswrapper[4721]: I0202 14:24:56.586718 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n_5c48ead1-c06b-4a13-b92a-ce7a474e6233/util/0.log" Feb 02 14:24:56 crc kubenswrapper[4721]: I0202 14:24:56.713544 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n_5c48ead1-c06b-4a13-b92a-ce7a474e6233/util/0.log" Feb 02 14:24:56 crc kubenswrapper[4721]: I0202 14:24:56.773338 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n_5c48ead1-c06b-4a13-b92a-ce7a474e6233/pull/0.log" Feb 02 14:24:56 crc kubenswrapper[4721]: I0202 14:24:56.796247 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n_5c48ead1-c06b-4a13-b92a-ce7a474e6233/pull/0.log" Feb 02 14:24:56 crc kubenswrapper[4721]: I0202 14:24:56.984668 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n_5c48ead1-c06b-4a13-b92a-ce7a474e6233/pull/0.log" Feb 02 14:24:57 crc kubenswrapper[4721]: I0202 14:24:57.005313 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n_5c48ead1-c06b-4a13-b92a-ce7a474e6233/util/0.log" Feb 02 14:24:57 crc kubenswrapper[4721]: I0202 14:24:57.006413 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_37793a4ef2e329483307c3ff79aaabd503a190102fbc19ed8ead42e2cc5zq4n_5c48ead1-c06b-4a13-b92a-ce7a474e6233/extract/0.log" Feb 02 14:24:57 crc kubenswrapper[4721]: I0202 14:24:57.264354 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-729mv_0562e590-1a66-4fbc-862d-833bc1600eac/manager/0.log" Feb 02 14:24:57 crc kubenswrapper[4721]: I0202 14:24:57.277677 4721 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-8zlv5_0c1486a5-ee95-4cde-9631-3c7c7aa31ae7/manager/0.log" Feb 02 14:24:57 crc kubenswrapper[4721]: I0202 14:24:57.473200 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-s75st_23be57b1-6b3e-4346-93f9-2c45b0562d2b/manager/0.log" Feb 02 14:24:57 crc kubenswrapper[4721]: I0202 14:24:57.570201 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-q5lbf_20f771bf-d003-48b0-8e50-0d1217f24b45/manager/0.log" Feb 02 14:24:57 crc kubenswrapper[4721]: I0202 14:24:57.767632 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-sq5w5_e5a04e0d-8a73-4f21-a61d-374d7a5784fb/manager/0.log" Feb 02 14:24:57 crc kubenswrapper[4721]: I0202 14:24:57.870344 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-x6p4t_8a86dacc-de73-4b52-994c-3b089ee427cc/manager/0.log" Feb 02 14:24:58 crc kubenswrapper[4721]: I0202 14:24:58.250153 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-hktcl_9d11c3e4-10b4-4ff4-aaa2-04e342d984b4/manager/0.log" Feb 02 14:24:58 crc kubenswrapper[4721]: I0202 14:24:58.364091 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-qqpfm_a13f2341-6b53-4a7b-b67a-4a1d1846805d/manager/0.log" Feb 02 14:24:58 crc kubenswrapper[4721]: I0202 14:24:58.632168 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-5x28t_39686eda-a258-408b-bf9c-7ff7d515ed9d/manager/0.log" Feb 02 14:24:58 crc kubenswrapper[4721]: I0202 14:24:58.633630 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-5vbh8_b9eeabcd-14ef-4800-9f0c-1a3cd515d2aa/manager/0.log" Feb 02 14:24:58 crc kubenswrapper[4721]: I0202 14:24:58.871696 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-ct6hc_1c4864d3-2fdd-4b98-ac89-aefb49b56187/manager/0.log" Feb 02 14:24:58 crc kubenswrapper[4721]: I0202 14:24:58.925040 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-sjvjw_2d60d537-ea47-42fa-94c3-61704aef0678/manager/0.log" Feb 02 14:24:59 crc kubenswrapper[4721]: I0202 14:24:59.136007 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-6z258_1f3087b4-acf0-4a27-9696-bdfb4728e96c/manager/0.log" Feb 02 14:24:59 crc kubenswrapper[4721]: I0202 14:24:59.145891 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-42qq8_dc736681-960e-4f76-bc10-25f529da020a/manager/0.log" Feb 02 14:24:59 crc kubenswrapper[4721]: I0202 14:24:59.321221 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-59c4b45c4dtgpkw_ae636942-3520-410e-b70a-b4fc19a527ca/manager/0.log" Feb 02 14:24:59 crc kubenswrapper[4721]: I0202 14:24:59.521785 4721 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-init-b64b9f5cb-mqpbl_e4514067-762e-4638-ad5a-a7d17297bc0d/operator/0.log" Feb 02 14:24:59 crc kubenswrapper[4721]: I0202 14:24:59.823918 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-lxqsx_abf13eed-433d-4afa-809d-bd863e469366/registry-server/0.log" Feb 02 14:25:00 crc kubenswrapper[4721]: I0202 14:25:00.069262 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-rzjts_6b33adce-a49a-4ce2-af29-412661aaf062/manager/0.log" Feb 02 14:25:00 crc kubenswrapper[4721]: I0202 14:25:00.229767 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-kqdjm_ed67384c-22d3-4466-8990-744b122efbf4/manager/0.log" Feb 02 14:25:00 crc kubenswrapper[4721]: I0202 14:25:00.507250 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5d6c59fb84-5s25p_55bc1d80-1d29-4e15-baca-49eee6fd3aa5/manager/0.log" Feb 02 14:25:00 crc kubenswrapper[4721]: I0202 14:25:00.514271 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xwkhz_56b67b2b-b9fd-4353-88e3-d4f1d44653e2/operator/0.log" Feb 02 14:25:01 crc kubenswrapper[4721]: I0202 14:25:01.193453 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-79zrv_499ca4ef-3867-407b-ab4a-64fff307e296/manager/0.log" Feb 02 14:25:01 crc kubenswrapper[4721]: I0202 14:25:01.197490 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-2828d_60ff9309-fd37-4618-b4f0-38704a558ec0/manager/0.log" Feb 02 14:25:01 crc kubenswrapper[4721]: I0202 14:25:01.293447 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5b9ffd7d65-rgkhb_79e5221b-04ee-496d-82b7-16af5b340595/manager/0.log" Feb 02 14:25:01 crc kubenswrapper[4721]: I0202 14:25:01.440289 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-4pk6v_058f996d-8009-4f83-864d-177f7b577cf0/manager/0.log" Feb 02 14:25:14 crc kubenswrapper[4721]: I0202 14:25:14.764127 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:25:14 crc kubenswrapper[4721]: I0202 14:25:14.764565 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:25:24 crc kubenswrapper[4721]: I0202 14:25:24.320213 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jdzwk_46f85b66-5a30-4bef-909c-26750b18e72d/control-plane-machine-set-operator/0.log" Feb 02 14:25:24 crc kubenswrapper[4721]: I0202 14:25:24.522579 4721 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pnfph_083f0d8a-e0c4-46ae-8993-8547dd260553/kube-rbac-proxy/0.log" Feb 02 14:25:24 crc kubenswrapper[4721]: I0202 14:25:24.597762 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pnfph_083f0d8a-e0c4-46ae-8993-8547dd260553/machine-api-operator/0.log" Feb 02 14:25:40 crc kubenswrapper[4721]: I0202 14:25:40.169831 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-qxmnx_caf3dcb7-c58d-4d36-9329-c9b8d3c354a8/cert-manager-controller/0.log" Feb 02 14:25:40 crc kubenswrapper[4721]: I0202 14:25:40.357869 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-k5vmr_988d3eab-804d-4db0-8855-b63ebbeabce4/cert-manager-cainjector/0.log" Feb 02 14:25:40 crc kubenswrapper[4721]: I0202 14:25:40.432730 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-xjhrq_aff1475e-36c5-471a-b04e-01cefc2d2763/cert-manager-webhook/0.log" Feb 02 14:25:44 crc kubenswrapper[4721]: I0202 14:25:44.763558 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:25:44 crc kubenswrapper[4721]: I0202 14:25:44.764201 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:25:44 crc kubenswrapper[4721]: I0202 14:25:44.764248 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 14:25:44 crc kubenswrapper[4721]: I0202 14:25:44.765132 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80a4d97025e59da4be229a82e47246c7035ccedac52df0fa5679fdc8c1c7b8ec"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 14:25:44 crc kubenswrapper[4721]: I0202 14:25:44.765205 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://80a4d97025e59da4be229a82e47246c7035ccedac52df0fa5679fdc8c1c7b8ec" gracePeriod=600 Feb 02 14:25:45 crc kubenswrapper[4721]: I0202 14:25:45.341722 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="80a4d97025e59da4be229a82e47246c7035ccedac52df0fa5679fdc8c1c7b8ec" exitCode=0 Feb 02 14:25:45 crc kubenswrapper[4721]: I0202 14:25:45.341778 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"80a4d97025e59da4be229a82e47246c7035ccedac52df0fa5679fdc8c1c7b8ec"} Feb 02 14:25:45 crc kubenswrapper[4721]: I0202 
14:25:45.342144 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a"} Feb 02 14:25:45 crc kubenswrapper[4721]: I0202 14:25:45.342185 4721 scope.go:117] "RemoveContainer" containerID="6f7ac9e0d3ce5a3abf70aaaf35d53db0f85d58208ac3969dbebc6edfb6879ffd" Feb 02 14:25:56 crc kubenswrapper[4721]: I0202 14:25:56.798721 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-2f9ms_b15ef257-c4ff-4fd9-a04c-a92d38e51b18/nmstate-console-plugin/0.log" Feb 02 14:25:57 crc kubenswrapper[4721]: I0202 14:25:57.001270 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-dlvcq_1cf5f077-bb9b-42de-ab25-70b762c3e2e1/nmstate-handler/0.log" Feb 02 14:25:57 crc kubenswrapper[4721]: I0202 14:25:57.010225 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-mmg2n_be1a5420-ea1d-40e0-bd09-241151dc6755/kube-rbac-proxy/0.log" Feb 02 14:25:57 crc kubenswrapper[4721]: I0202 14:25:57.139195 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-mmg2n_be1a5420-ea1d-40e0-bd09-241151dc6755/nmstate-metrics/0.log" Feb 02 14:25:57 crc kubenswrapper[4721]: I0202 14:25:57.258987 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-trhxn_38f375ca-8f76-4eb1-a92d-d46f7628ecf6/nmstate-operator/0.log" Feb 02 14:25:57 crc kubenswrapper[4721]: I0202 14:25:57.311786 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-j4jzl_92d17aed-5894-45b3-8fe9-08b5dfc7c702/nmstate-webhook/0.log" Feb 02 14:26:14 crc kubenswrapper[4721]: I0202 14:26:14.353513 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-756566789b-zpsf5_af1123f2-fce6-410b-a82b-9b292bb8bf68/manager/0.log" Feb 02 14:26:14 crc kubenswrapper[4721]: I0202 14:26:14.423637 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-756566789b-zpsf5_af1123f2-fce6-410b-a82b-9b292bb8bf68/kube-rbac-proxy/0.log" Feb 02 14:26:30 crc kubenswrapper[4721]: I0202 14:26:30.920520 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-jltbt_30a5c0d6-c773-4914-a3b1-1654a51817a9/prometheus-operator/0.log" Feb 02 14:26:31 crc kubenswrapper[4721]: I0202 14:26:31.130468 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_a1f92c36-0e50-485e-a728-7b42f1ab44c4/prometheus-operator-admission-webhook/0.log" Feb 02 14:26:31 crc kubenswrapper[4721]: I0202 14:26:31.183973 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_41c83c75-cfc8-4c33-97cf-484cc7dcd812/prometheus-operator-admission-webhook/0.log" Feb 02 14:26:31 crc kubenswrapper[4721]: I0202 14:26:31.390382 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-b2dk9_7ac0e2d1-4762-4c40-84c9-db0bde4f956f/operator/0.log" Feb 02 14:26:31 crc kubenswrapper[4721]: I0202 14:26:31.417099 
4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-6lvhx_6064a9a4-2316-4bdd-abf1-934e9167528a/observability-ui-dashboards/0.log" Feb 02 14:26:31 crc kubenswrapper[4721]: I0202 14:26:31.639456 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-5w6sx_a3affad2-ab35-4604-8239-56f69bf3727f/perses-operator/0.log" Feb 02 14:26:50 crc kubenswrapper[4721]: I0202 14:26:50.334492 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-79cf69ddc8-9sfnv_f9c5d281-206d-4729-a031-feb5b9234c8f/cluster-logging-operator/0.log" Feb 02 14:26:50 crc kubenswrapper[4721]: I0202 14:26:50.543177 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-ls7f7_749232df-9bfe-43cb-a716-6eadd2cbc290/collector/0.log" Feb 02 14:26:50 crc kubenswrapper[4721]: I0202 14:26:50.594233 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_2cb9902a-5fe1-42ee-a659-eebccc3aec15/loki-compactor/0.log" Feb 02 14:26:50 crc kubenswrapper[4721]: I0202 14:26:50.740909 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-5f678c8dd6-tb6gs_7a392a7d-824d-420d-bf0d-66ca95134ea6/loki-distributor/0.log" Feb 02 14:26:50 crc kubenswrapper[4721]: I0202 14:26:50.799374 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5f86bf5685-lsthj_6bbaf0c4-9bfc-4cf9-b238-4f494e492243/gateway/0.log" Feb 02 14:26:50 crc kubenswrapper[4721]: I0202 14:26:50.850466 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5f86bf5685-lsthj_6bbaf0c4-9bfc-4cf9-b238-4f494e492243/opa/0.log" Feb 02 14:26:50 crc kubenswrapper[4721]: I0202 14:26:50.966615 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5f86bf5685-p62nr_e0a2094f-7b9c-426c-b7ea-6a175be407f1/gateway/0.log" Feb 02 14:26:51 crc kubenswrapper[4721]: I0202 14:26:51.006687 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-5f86bf5685-p62nr_e0a2094f-7b9c-426c-b7ea-6a175be407f1/opa/0.log" Feb 02 14:26:51 crc kubenswrapper[4721]: I0202 14:26:51.140513 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_05f7eb9f-7ce9-4d66-b8e1-cc9eb0c1949e/loki-index-gateway/0.log" Feb 02 14:26:51 crc kubenswrapper[4721]: I0202 14:26:51.938312 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-76788598db-mnp7c_98490098-f31f-4ee3-9f15-ee37b8740035/loki-querier/0.log" Feb 02 14:26:51 crc kubenswrapper[4721]: I0202 14:26:51.957303 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_b3605888-b0c3-4049-8f6a-cd4f380b91a7/loki-ingester/0.log" Feb 02 14:26:52 crc kubenswrapper[4721]: I0202 14:26:52.150211 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-69d9546745-xs62z_93d83a8b-3334-43f3-b417-58a7fbd7282c/loki-query-frontend/0.log" Feb 02 14:27:06 crc kubenswrapper[4721]: I0202 14:27:06.707811 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-rq76j_d8fb94c8-b6a7-47c1-bf64-c01350b47983/kube-rbac-proxy/0.log" Feb 02 14:27:06 crc kubenswrapper[4721]: 
I0202 14:27:06.996767 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-rq76j_d8fb94c8-b6a7-47c1-bf64-c01350b47983/controller/0.log" Feb 02 14:27:07 crc kubenswrapper[4721]: I0202 14:27:07.073082 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/cp-frr-files/0.log" Feb 02 14:27:07 crc kubenswrapper[4721]: I0202 14:27:07.241180 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/cp-frr-files/0.log" Feb 02 14:27:07 crc kubenswrapper[4721]: I0202 14:27:07.246714 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/cp-reloader/0.log" Feb 02 14:27:07 crc kubenswrapper[4721]: I0202 14:27:07.261341 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/cp-metrics/0.log" Feb 02 14:27:07 crc kubenswrapper[4721]: I0202 14:27:07.340790 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/cp-reloader/0.log" Feb 02 14:27:07 crc kubenswrapper[4721]: I0202 14:27:07.527825 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/cp-reloader/0.log" Feb 02 14:27:07 crc kubenswrapper[4721]: I0202 14:27:07.534599 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/cp-metrics/0.log" Feb 02 14:27:07 crc kubenswrapper[4721]: I0202 14:27:07.551129 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/cp-frr-files/0.log" Feb 02 14:27:07 crc kubenswrapper[4721]: I0202 14:27:07.556709 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/cp-metrics/0.log" Feb 02 14:27:07 crc kubenswrapper[4721]: I0202 14:27:07.778155 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/cp-metrics/0.log" Feb 02 14:27:07 crc kubenswrapper[4721]: I0202 14:27:07.795991 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/cp-frr-files/0.log" Feb 02 14:27:07 crc kubenswrapper[4721]: I0202 14:27:07.820404 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/controller/0.log" Feb 02 14:27:07 crc kubenswrapper[4721]: I0202 14:27:07.830738 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/cp-reloader/0.log" Feb 02 14:27:08 crc kubenswrapper[4721]: I0202 14:27:08.029669 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/frr-metrics/0.log" Feb 02 14:27:08 crc kubenswrapper[4721]: I0202 14:27:08.054756 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/kube-rbac-proxy/0.log" Feb 02 14:27:08 crc kubenswrapper[4721]: I0202 14:27:08.097575 4721 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/kube-rbac-proxy-frr/0.log" Feb 02 14:27:08 crc kubenswrapper[4721]: I0202 14:27:08.291186 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/reloader/0.log" Feb 02 14:27:08 crc kubenswrapper[4721]: I0202 14:27:08.383752 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-4t8pn_4fda33e0-d0a3-4266-aeb1-fc07965d8c35/frr-k8s-webhook-server/0.log" Feb 02 14:27:08 crc kubenswrapper[4721]: I0202 14:27:08.618791 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-67895b6557-xpzcz_4c6e741b-2539-4be0-898c-5fee37f67d21/manager/0.log" Feb 02 14:27:08 crc kubenswrapper[4721]: I0202 14:27:08.788578 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-bb6bc86c6-l2cpd_10a7b124-f250-42d3-9e7c-af29d7204edb/webhook-server/0.log" Feb 02 14:27:08 crc kubenswrapper[4721]: I0202 14:27:08.937436 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2hhvl_486fb2e8-15fe-46c1-b62c-89f2b2abf064/kube-rbac-proxy/0.log" Feb 02 14:27:09 crc kubenswrapper[4721]: I0202 14:27:09.415044 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8ts6n_5f685485-23a9-45dd-90cd-62ab47eab713/frr/0.log" Feb 02 14:27:09 crc kubenswrapper[4721]: I0202 14:27:09.695190 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2hhvl_486fb2e8-15fe-46c1-b62c-89f2b2abf064/speaker/0.log" Feb 02 14:27:24 crc kubenswrapper[4721]: I0202 14:27:24.308814 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7_53fd5f54-e7fe-4d86-a5b7-3583e945fff3/util/0.log" Feb 02 14:27:24 crc kubenswrapper[4721]: I0202 14:27:24.444671 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7_53fd5f54-e7fe-4d86-a5b7-3583e945fff3/util/0.log" Feb 02 14:27:24 crc kubenswrapper[4721]: I0202 14:27:24.503041 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7_53fd5f54-e7fe-4d86-a5b7-3583e945fff3/pull/0.log" Feb 02 14:27:24 crc kubenswrapper[4721]: I0202 14:27:24.516778 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7_53fd5f54-e7fe-4d86-a5b7-3583e945fff3/pull/0.log" Feb 02 14:27:24 crc kubenswrapper[4721]: I0202 14:27:24.770666 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7_53fd5f54-e7fe-4d86-a5b7-3583e945fff3/pull/0.log" Feb 02 14:27:24 crc kubenswrapper[4721]: I0202 14:27:24.787373 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7_53fd5f54-e7fe-4d86-a5b7-3583e945fff3/util/0.log" Feb 02 14:27:24 crc kubenswrapper[4721]: I0202 14:27:24.839764 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2xhtf7_53fd5f54-e7fe-4d86-a5b7-3583e945fff3/extract/0.log" Feb 02 14:27:24 crc 
kubenswrapper[4721]: I0202 14:27:24.978674 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg_6f007b81-04cd-4913-ad24-51aa6c5b60c8/util/0.log" Feb 02 14:27:25 crc kubenswrapper[4721]: I0202 14:27:25.161186 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg_6f007b81-04cd-4913-ad24-51aa6c5b60c8/pull/0.log" Feb 02 14:27:25 crc kubenswrapper[4721]: I0202 14:27:25.169680 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg_6f007b81-04cd-4913-ad24-51aa6c5b60c8/pull/0.log" Feb 02 14:27:25 crc kubenswrapper[4721]: I0202 14:27:25.239598 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg_6f007b81-04cd-4913-ad24-51aa6c5b60c8/util/0.log" Feb 02 14:27:25 crc kubenswrapper[4721]: I0202 14:27:25.438306 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg_6f007b81-04cd-4913-ad24-51aa6c5b60c8/util/0.log" Feb 02 14:27:25 crc kubenswrapper[4721]: I0202 14:27:25.446900 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg_6f007b81-04cd-4913-ad24-51aa6c5b60c8/extract/0.log" Feb 02 14:27:25 crc kubenswrapper[4721]: I0202 14:27:25.463701 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dck9cgg_6f007b81-04cd-4913-ad24-51aa6c5b60c8/pull/0.log" Feb 02 14:27:25 crc kubenswrapper[4721]: I0202 14:27:25.643546 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n_4b23cf05-2074-4e51-b6ef-235b207d8b16/util/0.log" Feb 02 14:27:25 crc kubenswrapper[4721]: I0202 14:27:25.807544 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n_4b23cf05-2074-4e51-b6ef-235b207d8b16/util/0.log" Feb 02 14:27:25 crc kubenswrapper[4721]: I0202 14:27:25.814859 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n_4b23cf05-2074-4e51-b6ef-235b207d8b16/pull/0.log" Feb 02 14:27:25 crc kubenswrapper[4721]: I0202 14:27:25.854602 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n_4b23cf05-2074-4e51-b6ef-235b207d8b16/pull/0.log" Feb 02 14:27:26 crc kubenswrapper[4721]: I0202 14:27:26.027144 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n_4b23cf05-2074-4e51-b6ef-235b207d8b16/util/0.log" Feb 02 14:27:26 crc kubenswrapper[4721]: I0202 14:27:26.071642 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n_4b23cf05-2074-4e51-b6ef-235b207d8b16/extract/0.log" Feb 02 14:27:26 crc kubenswrapper[4721]: I0202 14:27:26.088752 4721 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360b8hh8n_4b23cf05-2074-4e51-b6ef-235b207d8b16/pull/0.log" Feb 02 14:27:26 crc kubenswrapper[4721]: I0202 14:27:26.234402 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl_3d506202-dc87-49f6-9160-ccedb0cbae19/util/0.log" Feb 02 14:27:26 crc kubenswrapper[4721]: I0202 14:27:26.492782 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl_3d506202-dc87-49f6-9160-ccedb0cbae19/util/0.log" Feb 02 14:27:26 crc kubenswrapper[4721]: I0202 14:27:26.511734 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl_3d506202-dc87-49f6-9160-ccedb0cbae19/pull/0.log" Feb 02 14:27:26 crc kubenswrapper[4721]: I0202 14:27:26.522957 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl_3d506202-dc87-49f6-9160-ccedb0cbae19/pull/0.log" Feb 02 14:27:26 crc kubenswrapper[4721]: I0202 14:27:26.648744 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl_3d506202-dc87-49f6-9160-ccedb0cbae19/util/0.log" Feb 02 14:27:26 crc kubenswrapper[4721]: I0202 14:27:26.695108 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl_3d506202-dc87-49f6-9160-ccedb0cbae19/extract/0.log" Feb 02 14:27:26 crc kubenswrapper[4721]: I0202 14:27:26.702961 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7138j6sl_3d506202-dc87-49f6-9160-ccedb0cbae19/pull/0.log" Feb 02 14:27:26 crc kubenswrapper[4721]: I0202 14:27:26.872619 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj_6b6f1f89-2c62-4c26-abd3-2d105289fc8c/util/0.log" Feb 02 14:27:27 crc kubenswrapper[4721]: I0202 14:27:27.023576 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj_6b6f1f89-2c62-4c26-abd3-2d105289fc8c/pull/0.log" Feb 02 14:27:27 crc kubenswrapper[4721]: I0202 14:27:27.025651 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj_6b6f1f89-2c62-4c26-abd3-2d105289fc8c/util/0.log" Feb 02 14:27:27 crc kubenswrapper[4721]: I0202 14:27:27.063815 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj_6b6f1f89-2c62-4c26-abd3-2d105289fc8c/pull/0.log" Feb 02 14:27:27 crc kubenswrapper[4721]: I0202 14:27:27.300389 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj_6b6f1f89-2c62-4c26-abd3-2d105289fc8c/util/0.log" Feb 02 14:27:27 crc kubenswrapper[4721]: I0202 14:27:27.301601 4721 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj_6b6f1f89-2c62-4c26-abd3-2d105289fc8c/extract/0.log" Feb 02 14:27:27 crc kubenswrapper[4721]: I0202 14:27:27.335142 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08ct9cj_6b6f1f89-2c62-4c26-abd3-2d105289fc8c/pull/0.log" Feb 02 14:27:27 crc kubenswrapper[4721]: I0202 14:27:27.540389 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tgnrr_ff82afec-f54e-4b47-8399-fd27b44558d3/extract-utilities/0.log" Feb 02 14:27:27 crc kubenswrapper[4721]: I0202 14:27:27.742479 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tgnrr_ff82afec-f54e-4b47-8399-fd27b44558d3/extract-content/0.log" Feb 02 14:27:27 crc kubenswrapper[4721]: I0202 14:27:27.756530 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tgnrr_ff82afec-f54e-4b47-8399-fd27b44558d3/extract-content/0.log" Feb 02 14:27:27 crc kubenswrapper[4721]: I0202 14:27:27.758800 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tgnrr_ff82afec-f54e-4b47-8399-fd27b44558d3/extract-utilities/0.log" Feb 02 14:27:27 crc kubenswrapper[4721]: I0202 14:27:27.974149 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tgnrr_ff82afec-f54e-4b47-8399-fd27b44558d3/extract-utilities/0.log" Feb 02 14:27:28 crc kubenswrapper[4721]: I0202 14:27:28.001512 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tgnrr_ff82afec-f54e-4b47-8399-fd27b44558d3/extract-content/0.log" Feb 02 14:27:28 crc kubenswrapper[4721]: I0202 14:27:28.287000 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kv46m_be9ad0b8-eef7-451f-82b9-1b5cc54c63c2/extract-utilities/0.log" Feb 02 14:27:28 crc kubenswrapper[4721]: I0202 14:27:28.557511 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kv46m_be9ad0b8-eef7-451f-82b9-1b5cc54c63c2/extract-content/0.log" Feb 02 14:27:28 crc kubenswrapper[4721]: I0202 14:27:28.593980 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kv46m_be9ad0b8-eef7-451f-82b9-1b5cc54c63c2/extract-content/0.log" Feb 02 14:27:28 crc kubenswrapper[4721]: I0202 14:27:28.594331 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kv46m_be9ad0b8-eef7-451f-82b9-1b5cc54c63c2/extract-utilities/0.log" Feb 02 14:27:28 crc kubenswrapper[4721]: I0202 14:27:28.775637 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-tgnrr_ff82afec-f54e-4b47-8399-fd27b44558d3/registry-server/0.log" Feb 02 14:27:28 crc kubenswrapper[4721]: I0202 14:27:28.786758 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kv46m_be9ad0b8-eef7-451f-82b9-1b5cc54c63c2/extract-utilities/0.log" Feb 02 14:27:28 crc kubenswrapper[4721]: I0202 14:27:28.858616 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kv46m_be9ad0b8-eef7-451f-82b9-1b5cc54c63c2/extract-content/0.log" Feb 02 14:27:28 crc kubenswrapper[4721]: I0202 14:27:28.995272 4721 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-wdnhz_884fbbc4-b86d-4f88-9fc6-2aa2015b81d3/marketplace-operator/0.log" Feb 02 14:27:29 crc kubenswrapper[4721]: I0202 14:27:29.095765 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c64xc_8851d4c5-8c20-440c-bb07-d7542ea1620d/extract-utilities/0.log" Feb 02 14:27:29 crc kubenswrapper[4721]: I0202 14:27:29.280998 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c64xc_8851d4c5-8c20-440c-bb07-d7542ea1620d/extract-utilities/0.log" Feb 02 14:27:29 crc kubenswrapper[4721]: I0202 14:27:29.329798 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c64xc_8851d4c5-8c20-440c-bb07-d7542ea1620d/extract-content/0.log" Feb 02 14:27:29 crc kubenswrapper[4721]: I0202 14:27:29.412789 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c64xc_8851d4c5-8c20-440c-bb07-d7542ea1620d/extract-content/0.log" Feb 02 14:27:29 crc kubenswrapper[4721]: I0202 14:27:29.568762 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c64xc_8851d4c5-8c20-440c-bb07-d7542ea1620d/extract-utilities/0.log" Feb 02 14:27:29 crc kubenswrapper[4721]: I0202 14:27:29.658468 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c64xc_8851d4c5-8c20-440c-bb07-d7542ea1620d/extract-content/0.log" Feb 02 14:27:29 crc kubenswrapper[4721]: I0202 14:27:29.867736 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q6jkv_7625a6ea-aff2-4a16-a62a-fec198126d2f/extract-utilities/0.log" Feb 02 14:27:29 crc kubenswrapper[4721]: I0202 14:27:29.895624 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kv46m_be9ad0b8-eef7-451f-82b9-1b5cc54c63c2/registry-server/0.log" Feb 02 14:27:30 crc kubenswrapper[4721]: I0202 14:27:30.007034 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c64xc_8851d4c5-8c20-440c-bb07-d7542ea1620d/registry-server/0.log" Feb 02 14:27:30 crc kubenswrapper[4721]: I0202 14:27:30.143812 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q6jkv_7625a6ea-aff2-4a16-a62a-fec198126d2f/extract-utilities/0.log" Feb 02 14:27:30 crc kubenswrapper[4721]: I0202 14:27:30.154342 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q6jkv_7625a6ea-aff2-4a16-a62a-fec198126d2f/extract-content/0.log" Feb 02 14:27:30 crc kubenswrapper[4721]: I0202 14:27:30.189704 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q6jkv_7625a6ea-aff2-4a16-a62a-fec198126d2f/extract-content/0.log" Feb 02 14:27:30 crc kubenswrapper[4721]: I0202 14:27:30.371139 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q6jkv_7625a6ea-aff2-4a16-a62a-fec198126d2f/extract-utilities/0.log" Feb 02 14:27:30 crc kubenswrapper[4721]: I0202 14:27:30.371630 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q6jkv_7625a6ea-aff2-4a16-a62a-fec198126d2f/extract-content/0.log" Feb 02 14:27:30 crc kubenswrapper[4721]: I0202 14:27:30.726967 4721 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q6jkv_7625a6ea-aff2-4a16-a62a-fec198126d2f/registry-server/0.log" Feb 02 14:27:44 crc kubenswrapper[4721]: I0202 14:27:44.148428 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-567c77c7c4-mwhgx_a1f92c36-0e50-485e-a728-7b42f1ab44c4/prometheus-operator-admission-webhook/0.log" Feb 02 14:27:44 crc kubenswrapper[4721]: I0202 14:27:44.164672 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-jltbt_30a5c0d6-c773-4914-a3b1-1654a51817a9/prometheus-operator/0.log" Feb 02 14:27:44 crc kubenswrapper[4721]: I0202 14:27:44.181673 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-567c77c7c4-zfrs9_41c83c75-cfc8-4c33-97cf-484cc7dcd812/prometheus-operator-admission-webhook/0.log" Feb 02 14:27:44 crc kubenswrapper[4721]: I0202 14:27:44.357982 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-b2dk9_7ac0e2d1-4762-4c40-84c9-db0bde4f956f/operator/0.log" Feb 02 14:27:44 crc kubenswrapper[4721]: I0202 14:27:44.390328 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-6lvhx_6064a9a4-2316-4bdd-abf1-934e9167528a/observability-ui-dashboards/0.log" Feb 02 14:27:44 crc kubenswrapper[4721]: I0202 14:27:44.427657 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-5w6sx_a3affad2-ab35-4604-8239-56f69bf3727f/perses-operator/0.log" Feb 02 14:27:58 crc kubenswrapper[4721]: I0202 14:27:58.983578 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-756566789b-zpsf5_af1123f2-fce6-410b-a82b-9b292bb8bf68/kube-rbac-proxy/0.log" Feb 02 14:27:59 crc kubenswrapper[4721]: I0202 14:27:59.076296 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-756566789b-zpsf5_af1123f2-fce6-410b-a82b-9b292bb8bf68/manager/0.log" Feb 02 14:28:11 crc kubenswrapper[4721]: E0202 14:28:11.149611 4721 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.247:51496->38.129.56.247:38909: write tcp 38.129.56.247:51496->38.129.56.247:38909: write: broken pipe Feb 02 14:28:14 crc kubenswrapper[4721]: I0202 14:28:14.764207 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:28:14 crc kubenswrapper[4721]: I0202 14:28:14.764841 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:28:44 crc kubenswrapper[4721]: I0202 14:28:44.763351 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Feb 02 14:28:44 crc kubenswrapper[4721]: I0202 14:28:44.763986 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.564225 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x6zb9"] Feb 02 14:28:48 crc kubenswrapper[4721]: E0202 14:28:48.565340 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00ff63e7-883d-4e6d-985b-9122517d5081" containerName="container-00" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.565354 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="00ff63e7-883d-4e6d-985b-9122517d5081" containerName="container-00" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.565572 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="00ff63e7-883d-4e6d-985b-9122517d5081" containerName="container-00" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.567197 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.574696 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6zb9"] Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.616729 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa73e29e-aeec-4257-abab-cc99e8e99afa-utilities\") pod \"certified-operators-x6zb9\" (UID: \"aa73e29e-aeec-4257-abab-cc99e8e99afa\") " pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.617091 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa73e29e-aeec-4257-abab-cc99e8e99afa-catalog-content\") pod \"certified-operators-x6zb9\" (UID: \"aa73e29e-aeec-4257-abab-cc99e8e99afa\") " pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.617331 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfvjj\" (UniqueName: \"kubernetes.io/projected/aa73e29e-aeec-4257-abab-cc99e8e99afa-kube-api-access-tfvjj\") pod \"certified-operators-x6zb9\" (UID: \"aa73e29e-aeec-4257-abab-cc99e8e99afa\") " pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.719997 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa73e29e-aeec-4257-abab-cc99e8e99afa-utilities\") pod \"certified-operators-x6zb9\" (UID: \"aa73e29e-aeec-4257-abab-cc99e8e99afa\") " pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.720209 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa73e29e-aeec-4257-abab-cc99e8e99afa-catalog-content\") pod \"certified-operators-x6zb9\" (UID: \"aa73e29e-aeec-4257-abab-cc99e8e99afa\") " 
pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.720349 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfvjj\" (UniqueName: \"kubernetes.io/projected/aa73e29e-aeec-4257-abab-cc99e8e99afa-kube-api-access-tfvjj\") pod \"certified-operators-x6zb9\" (UID: \"aa73e29e-aeec-4257-abab-cc99e8e99afa\") " pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.720689 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa73e29e-aeec-4257-abab-cc99e8e99afa-utilities\") pod \"certified-operators-x6zb9\" (UID: \"aa73e29e-aeec-4257-abab-cc99e8e99afa\") " pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.720753 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa73e29e-aeec-4257-abab-cc99e8e99afa-catalog-content\") pod \"certified-operators-x6zb9\" (UID: \"aa73e29e-aeec-4257-abab-cc99e8e99afa\") " pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.757122 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jrptg"] Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.760380 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.761172 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfvjj\" (UniqueName: \"kubernetes.io/projected/aa73e29e-aeec-4257-abab-cc99e8e99afa-kube-api-access-tfvjj\") pod \"certified-operators-x6zb9\" (UID: \"aa73e29e-aeec-4257-abab-cc99e8e99afa\") " pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.773679 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jrptg"] Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.822630 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzcxd\" (UniqueName: \"kubernetes.io/projected/b2305307-fc37-4522-abb5-6dc428e94e61-kube-api-access-lzcxd\") pod \"community-operators-jrptg\" (UID: \"b2305307-fc37-4522-abb5-6dc428e94e61\") " pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.822710 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2305307-fc37-4522-abb5-6dc428e94e61-utilities\") pod \"community-operators-jrptg\" (UID: \"b2305307-fc37-4522-abb5-6dc428e94e61\") " pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.822781 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2305307-fc37-4522-abb5-6dc428e94e61-catalog-content\") pod \"community-operators-jrptg\" (UID: \"b2305307-fc37-4522-abb5-6dc428e94e61\") " pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.890756 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.926262 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzcxd\" (UniqueName: \"kubernetes.io/projected/b2305307-fc37-4522-abb5-6dc428e94e61-kube-api-access-lzcxd\") pod \"community-operators-jrptg\" (UID: \"b2305307-fc37-4522-abb5-6dc428e94e61\") " pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.926632 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2305307-fc37-4522-abb5-6dc428e94e61-utilities\") pod \"community-operators-jrptg\" (UID: \"b2305307-fc37-4522-abb5-6dc428e94e61\") " pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.926738 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2305307-fc37-4522-abb5-6dc428e94e61-catalog-content\") pod \"community-operators-jrptg\" (UID: \"b2305307-fc37-4522-abb5-6dc428e94e61\") " pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.927485 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2305307-fc37-4522-abb5-6dc428e94e61-utilities\") pod \"community-operators-jrptg\" (UID: \"b2305307-fc37-4522-abb5-6dc428e94e61\") " pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.927500 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2305307-fc37-4522-abb5-6dc428e94e61-catalog-content\") pod \"community-operators-jrptg\" (UID: \"b2305307-fc37-4522-abb5-6dc428e94e61\") " pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:28:48 crc kubenswrapper[4721]: I0202 14:28:48.954111 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzcxd\" (UniqueName: \"kubernetes.io/projected/b2305307-fc37-4522-abb5-6dc428e94e61-kube-api-access-lzcxd\") pod \"community-operators-jrptg\" (UID: \"b2305307-fc37-4522-abb5-6dc428e94e61\") " pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:28:49 crc kubenswrapper[4721]: I0202 14:28:49.154865 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:28:49 crc kubenswrapper[4721]: I0202 14:28:49.719383 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6zb9"] Feb 02 14:28:49 crc kubenswrapper[4721]: I0202 14:28:49.945509 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jrptg"] Feb 02 14:28:50 crc kubenswrapper[4721]: I0202 14:28:50.551810 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6zb9" event={"ID":"aa73e29e-aeec-4257-abab-cc99e8e99afa","Type":"ContainerStarted","Data":"c5d72c4cb5231aaa48bd8c849bdb7a55ce6b86d4b2df1ba54009fff2f035fd3c"} Feb 02 14:28:50 crc kubenswrapper[4721]: I0202 14:28:50.553507 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jrptg" event={"ID":"b2305307-fc37-4522-abb5-6dc428e94e61","Type":"ContainerStarted","Data":"037261c6a323f6dbd716adcfb2e228053f7afc8794d4b86219c39f4889d652e7"} Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.164380 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gqkjm"] Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.167853 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.179711 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqkjm"] Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.265781 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-utilities\") pod \"redhat-operators-gqkjm\" (UID: \"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6\") " pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.265982 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-catalog-content\") pod \"redhat-operators-gqkjm\" (UID: \"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6\") " pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.266026 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdrfd\" (UniqueName: \"kubernetes.io/projected/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-kube-api-access-mdrfd\") pod \"redhat-operators-gqkjm\" (UID: \"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6\") " pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.369102 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-catalog-content\") pod \"redhat-operators-gqkjm\" (UID: \"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6\") " pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.369553 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdrfd\" (UniqueName: \"kubernetes.io/projected/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-kube-api-access-mdrfd\") pod \"redhat-operators-gqkjm\" (UID: 
\"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6\") " pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.369673 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-catalog-content\") pod \"redhat-operators-gqkjm\" (UID: \"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6\") " pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.369710 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-utilities\") pod \"redhat-operators-gqkjm\" (UID: \"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6\") " pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.369989 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-utilities\") pod \"redhat-operators-gqkjm\" (UID: \"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6\") " pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.391185 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdrfd\" (UniqueName: \"kubernetes.io/projected/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-kube-api-access-mdrfd\") pod \"redhat-operators-gqkjm\" (UID: \"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6\") " pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.511321 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.566475 4721 generic.go:334] "Generic (PLEG): container finished" podID="b2305307-fc37-4522-abb5-6dc428e94e61" containerID="45f2eea2c429dbc41d6c0c9c84d8c1d4337067337a0e5187b2ddf2ea0476a477" exitCode=0 Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.566527 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jrptg" event={"ID":"b2305307-fc37-4522-abb5-6dc428e94e61","Type":"ContainerDied","Data":"45f2eea2c429dbc41d6c0c9c84d8c1d4337067337a0e5187b2ddf2ea0476a477"} Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.569435 4721 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.570835 4721 generic.go:334] "Generic (PLEG): container finished" podID="aa73e29e-aeec-4257-abab-cc99e8e99afa" containerID="2ae243d246dc83d8f161289ab6eb6c82efad98a9f4b7e1771cf1c0f294f7fcc5" exitCode=0 Feb 02 14:28:51 crc kubenswrapper[4721]: I0202 14:28:51.570865 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6zb9" event={"ID":"aa73e29e-aeec-4257-abab-cc99e8e99afa","Type":"ContainerDied","Data":"2ae243d246dc83d8f161289ab6eb6c82efad98a9f4b7e1771cf1c0f294f7fcc5"} Feb 02 14:28:52 crc kubenswrapper[4721]: I0202 14:28:52.017826 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqkjm"] Feb 02 14:28:52 crc kubenswrapper[4721]: W0202 14:28:52.030507 4721 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef3b13ac_f3ee_4b4b_a8eb_365a926869e6.slice/crio-15ed5f7bb533c11c83fb5d5905a7a20f12acc04f97e988e956205502593356c9 WatchSource:0}: Error finding container 15ed5f7bb533c11c83fb5d5905a7a20f12acc04f97e988e956205502593356c9: Status 404 returned error can't find the container with id 15ed5f7bb533c11c83fb5d5905a7a20f12acc04f97e988e956205502593356c9 Feb 02 14:28:52 crc kubenswrapper[4721]: I0202 14:28:52.584322 4721 generic.go:334] "Generic (PLEG): container finished" podID="ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" containerID="f1979fe9a92ae0c671cb466e891c9042ba98ed5b8fb336d632a9e9d82d0813c1" exitCode=0 Feb 02 14:28:52 crc kubenswrapper[4721]: I0202 14:28:52.584522 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqkjm" event={"ID":"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6","Type":"ContainerDied","Data":"f1979fe9a92ae0c671cb466e891c9042ba98ed5b8fb336d632a9e9d82d0813c1"} Feb 02 14:28:52 crc kubenswrapper[4721]: I0202 14:28:52.584644 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqkjm" event={"ID":"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6","Type":"ContainerStarted","Data":"15ed5f7bb533c11c83fb5d5905a7a20f12acc04f97e988e956205502593356c9"} Feb 02 14:28:53 crc kubenswrapper[4721]: I0202 14:28:53.603943 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6zb9" event={"ID":"aa73e29e-aeec-4257-abab-cc99e8e99afa","Type":"ContainerStarted","Data":"c92eaf1ac3e8622070f78fbfd4bc88dd8c6eebf96d764ee8dc4c9bce9c8c2f4b"} Feb 02 14:28:53 crc kubenswrapper[4721]: I0202 14:28:53.610509 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jrptg" event={"ID":"b2305307-fc37-4522-abb5-6dc428e94e61","Type":"ContainerStarted","Data":"dd3627080e2dde3280df7f039683d58fe97f48d5c559afaab34d198530a21aa4"} Feb 02 14:28:54 crc kubenswrapper[4721]: I0202 14:28:54.622451 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqkjm" event={"ID":"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6","Type":"ContainerStarted","Data":"bb504c21a6cea0a7496dbfbf4e6c97a51631619cc916f155c9b3788feb3c7df9"} Feb 02 14:28:54 crc kubenswrapper[4721]: I0202 14:28:54.625958 4721 generic.go:334] "Generic (PLEG): container finished" podID="aa73e29e-aeec-4257-abab-cc99e8e99afa" containerID="c92eaf1ac3e8622070f78fbfd4bc88dd8c6eebf96d764ee8dc4c9bce9c8c2f4b" exitCode=0 Feb 02 14:28:54 crc kubenswrapper[4721]: I0202 14:28:54.626033 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6zb9" event={"ID":"aa73e29e-aeec-4257-abab-cc99e8e99afa","Type":"ContainerDied","Data":"c92eaf1ac3e8622070f78fbfd4bc88dd8c6eebf96d764ee8dc4c9bce9c8c2f4b"} Feb 02 14:28:54 crc kubenswrapper[4721]: I0202 14:28:54.628468 4721 generic.go:334] "Generic (PLEG): container finished" podID="b2305307-fc37-4522-abb5-6dc428e94e61" containerID="dd3627080e2dde3280df7f039683d58fe97f48d5c559afaab34d198530a21aa4" exitCode=0 Feb 02 14:28:54 crc kubenswrapper[4721]: I0202 14:28:54.628510 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jrptg" event={"ID":"b2305307-fc37-4522-abb5-6dc428e94e61","Type":"ContainerDied","Data":"dd3627080e2dde3280df7f039683d58fe97f48d5c559afaab34d198530a21aa4"} Feb 02 14:28:55 crc kubenswrapper[4721]: I0202 14:28:55.643136 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-x6zb9" event={"ID":"aa73e29e-aeec-4257-abab-cc99e8e99afa","Type":"ContainerStarted","Data":"ebbf4cace5aefcc85a77e7f9064057583bd5cbf813e2854bd7ad81896b7b391f"} Feb 02 14:28:55 crc kubenswrapper[4721]: I0202 14:28:55.650029 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jrptg" event={"ID":"b2305307-fc37-4522-abb5-6dc428e94e61","Type":"ContainerStarted","Data":"99be6af392b49413d48b4bcac1fa9431ca1a8e1ac8889b73f875ce9d9eb15be8"} Feb 02 14:28:55 crc kubenswrapper[4721]: I0202 14:28:55.675012 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x6zb9" podStartSLOduration=4.095566349 podStartE2EDuration="7.674982266s" podCreationTimestamp="2026-02-02 14:28:48 +0000 UTC" firstStartedPulling="2026-02-02 14:28:51.572507711 +0000 UTC m=+5271.875022120" lastFinishedPulling="2026-02-02 14:28:55.151923648 +0000 UTC m=+5275.454438037" observedRunningTime="2026-02-02 14:28:55.663731982 +0000 UTC m=+5275.966246381" watchObservedRunningTime="2026-02-02 14:28:55.674982266 +0000 UTC m=+5275.977496655" Feb 02 14:28:55 crc kubenswrapper[4721]: I0202 14:28:55.695489 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jrptg" podStartSLOduration=4.024291394 podStartE2EDuration="7.695451249s" podCreationTimestamp="2026-02-02 14:28:48 +0000 UTC" firstStartedPulling="2026-02-02 14:28:51.568980296 +0000 UTC m=+5271.871494685" lastFinishedPulling="2026-02-02 14:28:55.240140151 +0000 UTC m=+5275.542654540" observedRunningTime="2026-02-02 14:28:55.690219168 +0000 UTC m=+5275.992733587" watchObservedRunningTime="2026-02-02 14:28:55.695451249 +0000 UTC m=+5275.997965638" Feb 02 14:28:58 crc kubenswrapper[4721]: I0202 14:28:58.892907 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:28:58 crc kubenswrapper[4721]: I0202 14:28:58.894912 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:28:58 crc kubenswrapper[4721]: I0202 14:28:58.954546 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:28:59 crc kubenswrapper[4721]: I0202 14:28:59.156708 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:28:59 crc kubenswrapper[4721]: I0202 14:28:59.156773 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:28:59 crc kubenswrapper[4721]: I0202 14:28:59.209142 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:29:00 crc kubenswrapper[4721]: I0202 14:29:00.709565 4721 generic.go:334] "Generic (PLEG): container finished" podID="ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" containerID="bb504c21a6cea0a7496dbfbf4e6c97a51631619cc916f155c9b3788feb3c7df9" exitCode=0 Feb 02 14:29:00 crc kubenswrapper[4721]: I0202 14:29:00.711816 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqkjm" event={"ID":"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6","Type":"ContainerDied","Data":"bb504c21a6cea0a7496dbfbf4e6c97a51631619cc916f155c9b3788feb3c7df9"} Feb 02 14:29:01 crc kubenswrapper[4721]: 
I0202 14:29:01.173401 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:29:02 crc kubenswrapper[4721]: I0202 14:29:02.738758 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqkjm" event={"ID":"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6","Type":"ContainerStarted","Data":"2c5f9e0314e57b27f735487f6e47e6d06478739d292a9ee40102436d47095166"} Feb 02 14:29:02 crc kubenswrapper[4721]: I0202 14:29:02.765092 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gqkjm" podStartSLOduration=2.622837155 podStartE2EDuration="11.765055463s" podCreationTimestamp="2026-02-02 14:28:51 +0000 UTC" firstStartedPulling="2026-02-02 14:28:52.589836621 +0000 UTC m=+5272.892351010" lastFinishedPulling="2026-02-02 14:29:01.732054929 +0000 UTC m=+5282.034569318" observedRunningTime="2026-02-02 14:29:02.764007644 +0000 UTC m=+5283.066522033" watchObservedRunningTime="2026-02-02 14:29:02.765055463 +0000 UTC m=+5283.067569862" Feb 02 14:29:03 crc kubenswrapper[4721]: I0202 14:29:03.755500 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6zb9"] Feb 02 14:29:03 crc kubenswrapper[4721]: I0202 14:29:03.756292 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x6zb9" podUID="aa73e29e-aeec-4257-abab-cc99e8e99afa" containerName="registry-server" containerID="cri-o://ebbf4cace5aefcc85a77e7f9064057583bd5cbf813e2854bd7ad81896b7b391f" gracePeriod=2 Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.428470 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.507206 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfvjj\" (UniqueName: \"kubernetes.io/projected/aa73e29e-aeec-4257-abab-cc99e8e99afa-kube-api-access-tfvjj\") pod \"aa73e29e-aeec-4257-abab-cc99e8e99afa\" (UID: \"aa73e29e-aeec-4257-abab-cc99e8e99afa\") " Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.508888 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa73e29e-aeec-4257-abab-cc99e8e99afa-catalog-content\") pod \"aa73e29e-aeec-4257-abab-cc99e8e99afa\" (UID: \"aa73e29e-aeec-4257-abab-cc99e8e99afa\") " Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.509031 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa73e29e-aeec-4257-abab-cc99e8e99afa-utilities\") pod \"aa73e29e-aeec-4257-abab-cc99e8e99afa\" (UID: \"aa73e29e-aeec-4257-abab-cc99e8e99afa\") " Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.509932 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa73e29e-aeec-4257-abab-cc99e8e99afa-utilities" (OuterVolumeSpecName: "utilities") pod "aa73e29e-aeec-4257-abab-cc99e8e99afa" (UID: "aa73e29e-aeec-4257-abab-cc99e8e99afa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.510526 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa73e29e-aeec-4257-abab-cc99e8e99afa-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.515398 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa73e29e-aeec-4257-abab-cc99e8e99afa-kube-api-access-tfvjj" (OuterVolumeSpecName: "kube-api-access-tfvjj") pod "aa73e29e-aeec-4257-abab-cc99e8e99afa" (UID: "aa73e29e-aeec-4257-abab-cc99e8e99afa"). InnerVolumeSpecName "kube-api-access-tfvjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.561710 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa73e29e-aeec-4257-abab-cc99e8e99afa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa73e29e-aeec-4257-abab-cc99e8e99afa" (UID: "aa73e29e-aeec-4257-abab-cc99e8e99afa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.613343 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfvjj\" (UniqueName: \"kubernetes.io/projected/aa73e29e-aeec-4257-abab-cc99e8e99afa-kube-api-access-tfvjj\") on node \"crc\" DevicePath \"\"" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.613383 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa73e29e-aeec-4257-abab-cc99e8e99afa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.776132 4721 generic.go:334] "Generic (PLEG): container finished" podID="aa73e29e-aeec-4257-abab-cc99e8e99afa" containerID="ebbf4cace5aefcc85a77e7f9064057583bd5cbf813e2854bd7ad81896b7b391f" exitCode=0 Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.776538 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6zb9" event={"ID":"aa73e29e-aeec-4257-abab-cc99e8e99afa","Type":"ContainerDied","Data":"ebbf4cace5aefcc85a77e7f9064057583bd5cbf813e2854bd7ad81896b7b391f"} Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.776580 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6zb9" event={"ID":"aa73e29e-aeec-4257-abab-cc99e8e99afa","Type":"ContainerDied","Data":"c5d72c4cb5231aaa48bd8c849bdb7a55ce6b86d4b2df1ba54009fff2f035fd3c"} Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.776608 4721 scope.go:117] "RemoveContainer" containerID="ebbf4cace5aefcc85a77e7f9064057583bd5cbf813e2854bd7ad81896b7b391f" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.776828 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x6zb9" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.810621 4721 scope.go:117] "RemoveContainer" containerID="c92eaf1ac3e8622070f78fbfd4bc88dd8c6eebf96d764ee8dc4c9bce9c8c2f4b" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.825985 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6zb9"] Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.840862 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x6zb9"] Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.841388 4721 scope.go:117] "RemoveContainer" containerID="2ae243d246dc83d8f161289ab6eb6c82efad98a9f4b7e1771cf1c0f294f7fcc5" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.899578 4721 scope.go:117] "RemoveContainer" containerID="ebbf4cace5aefcc85a77e7f9064057583bd5cbf813e2854bd7ad81896b7b391f" Feb 02 14:29:04 crc kubenswrapper[4721]: E0202 14:29:04.900041 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebbf4cace5aefcc85a77e7f9064057583bd5cbf813e2854bd7ad81896b7b391f\": container with ID starting with ebbf4cace5aefcc85a77e7f9064057583bd5cbf813e2854bd7ad81896b7b391f not found: ID does not exist" containerID="ebbf4cace5aefcc85a77e7f9064057583bd5cbf813e2854bd7ad81896b7b391f" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.900091 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebbf4cace5aefcc85a77e7f9064057583bd5cbf813e2854bd7ad81896b7b391f"} err="failed to get container status \"ebbf4cace5aefcc85a77e7f9064057583bd5cbf813e2854bd7ad81896b7b391f\": rpc error: code = NotFound desc = could not find container \"ebbf4cace5aefcc85a77e7f9064057583bd5cbf813e2854bd7ad81896b7b391f\": container with ID starting with ebbf4cace5aefcc85a77e7f9064057583bd5cbf813e2854bd7ad81896b7b391f not found: ID does not exist" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.900123 4721 scope.go:117] "RemoveContainer" containerID="c92eaf1ac3e8622070f78fbfd4bc88dd8c6eebf96d764ee8dc4c9bce9c8c2f4b" Feb 02 14:29:04 crc kubenswrapper[4721]: E0202 14:29:04.900460 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c92eaf1ac3e8622070f78fbfd4bc88dd8c6eebf96d764ee8dc4c9bce9c8c2f4b\": container with ID starting with c92eaf1ac3e8622070f78fbfd4bc88dd8c6eebf96d764ee8dc4c9bce9c8c2f4b not found: ID does not exist" containerID="c92eaf1ac3e8622070f78fbfd4bc88dd8c6eebf96d764ee8dc4c9bce9c8c2f4b" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.900489 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c92eaf1ac3e8622070f78fbfd4bc88dd8c6eebf96d764ee8dc4c9bce9c8c2f4b"} err="failed to get container status \"c92eaf1ac3e8622070f78fbfd4bc88dd8c6eebf96d764ee8dc4c9bce9c8c2f4b\": rpc error: code = NotFound desc = could not find container \"c92eaf1ac3e8622070f78fbfd4bc88dd8c6eebf96d764ee8dc4c9bce9c8c2f4b\": container with ID starting with c92eaf1ac3e8622070f78fbfd4bc88dd8c6eebf96d764ee8dc4c9bce9c8c2f4b not found: ID does not exist" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.900504 4721 scope.go:117] "RemoveContainer" containerID="2ae243d246dc83d8f161289ab6eb6c82efad98a9f4b7e1771cf1c0f294f7fcc5" Feb 02 14:29:04 crc kubenswrapper[4721]: E0202 14:29:04.900716 4721 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2ae243d246dc83d8f161289ab6eb6c82efad98a9f4b7e1771cf1c0f294f7fcc5\": container with ID starting with 2ae243d246dc83d8f161289ab6eb6c82efad98a9f4b7e1771cf1c0f294f7fcc5 not found: ID does not exist" containerID="2ae243d246dc83d8f161289ab6eb6c82efad98a9f4b7e1771cf1c0f294f7fcc5" Feb 02 14:29:04 crc kubenswrapper[4721]: I0202 14:29:04.900744 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae243d246dc83d8f161289ab6eb6c82efad98a9f4b7e1771cf1c0f294f7fcc5"} err="failed to get container status \"2ae243d246dc83d8f161289ab6eb6c82efad98a9f4b7e1771cf1c0f294f7fcc5\": rpc error: code = NotFound desc = could not find container \"2ae243d246dc83d8f161289ab6eb6c82efad98a9f4b7e1771cf1c0f294f7fcc5\": container with ID starting with 2ae243d246dc83d8f161289ab6eb6c82efad98a9f4b7e1771cf1c0f294f7fcc5 not found: ID does not exist" Feb 02 14:29:06 crc kubenswrapper[4721]: I0202 14:29:06.423666 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa73e29e-aeec-4257-abab-cc99e8e99afa" path="/var/lib/kubelet/pods/aa73e29e-aeec-4257-abab-cc99e8e99afa/volumes" Feb 02 14:29:09 crc kubenswrapper[4721]: I0202 14:29:09.207345 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:29:09 crc kubenswrapper[4721]: I0202 14:29:09.274123 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jrptg"] Feb 02 14:29:09 crc kubenswrapper[4721]: I0202 14:29:09.829866 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jrptg" podUID="b2305307-fc37-4522-abb5-6dc428e94e61" containerName="registry-server" containerID="cri-o://99be6af392b49413d48b4bcac1fa9431ca1a8e1ac8889b73f875ce9d9eb15be8" gracePeriod=2 Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.348809 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.485581 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2305307-fc37-4522-abb5-6dc428e94e61-catalog-content\") pod \"b2305307-fc37-4522-abb5-6dc428e94e61\" (UID: \"b2305307-fc37-4522-abb5-6dc428e94e61\") " Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.485740 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2305307-fc37-4522-abb5-6dc428e94e61-utilities\") pod \"b2305307-fc37-4522-abb5-6dc428e94e61\" (UID: \"b2305307-fc37-4522-abb5-6dc428e94e61\") " Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.485849 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzcxd\" (UniqueName: \"kubernetes.io/projected/b2305307-fc37-4522-abb5-6dc428e94e61-kube-api-access-lzcxd\") pod \"b2305307-fc37-4522-abb5-6dc428e94e61\" (UID: \"b2305307-fc37-4522-abb5-6dc428e94e61\") " Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.488124 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2305307-fc37-4522-abb5-6dc428e94e61-utilities" (OuterVolumeSpecName: "utilities") pod "b2305307-fc37-4522-abb5-6dc428e94e61" (UID: "b2305307-fc37-4522-abb5-6dc428e94e61"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.501468 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2305307-fc37-4522-abb5-6dc428e94e61-kube-api-access-lzcxd" (OuterVolumeSpecName: "kube-api-access-lzcxd") pod "b2305307-fc37-4522-abb5-6dc428e94e61" (UID: "b2305307-fc37-4522-abb5-6dc428e94e61"). InnerVolumeSpecName "kube-api-access-lzcxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.559225 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2305307-fc37-4522-abb5-6dc428e94e61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2305307-fc37-4522-abb5-6dc428e94e61" (UID: "b2305307-fc37-4522-abb5-6dc428e94e61"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.590215 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2305307-fc37-4522-abb5-6dc428e94e61-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.590240 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2305307-fc37-4522-abb5-6dc428e94e61-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.590250 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzcxd\" (UniqueName: \"kubernetes.io/projected/b2305307-fc37-4522-abb5-6dc428e94e61-kube-api-access-lzcxd\") on node \"crc\" DevicePath \"\"" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.842866 4721 generic.go:334] "Generic (PLEG): container finished" podID="b2305307-fc37-4522-abb5-6dc428e94e61" containerID="99be6af392b49413d48b4bcac1fa9431ca1a8e1ac8889b73f875ce9d9eb15be8" exitCode=0 Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.842931 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jrptg" event={"ID":"b2305307-fc37-4522-abb5-6dc428e94e61","Type":"ContainerDied","Data":"99be6af392b49413d48b4bcac1fa9431ca1a8e1ac8889b73f875ce9d9eb15be8"} Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.842968 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jrptg" event={"ID":"b2305307-fc37-4522-abb5-6dc428e94e61","Type":"ContainerDied","Data":"037261c6a323f6dbd716adcfb2e228053f7afc8794d4b86219c39f4889d652e7"} Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.842987 4721 scope.go:117] "RemoveContainer" containerID="99be6af392b49413d48b4bcac1fa9431ca1a8e1ac8889b73f875ce9d9eb15be8" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.843289 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jrptg" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.870348 4721 scope.go:117] "RemoveContainer" containerID="dd3627080e2dde3280df7f039683d58fe97f48d5c559afaab34d198530a21aa4" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.902276 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jrptg"] Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.912875 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jrptg"] Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.930611 4721 scope.go:117] "RemoveContainer" containerID="45f2eea2c429dbc41d6c0c9c84d8c1d4337067337a0e5187b2ddf2ea0476a477" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.971334 4721 scope.go:117] "RemoveContainer" containerID="99be6af392b49413d48b4bcac1fa9431ca1a8e1ac8889b73f875ce9d9eb15be8" Feb 02 14:29:10 crc kubenswrapper[4721]: E0202 14:29:10.972324 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99be6af392b49413d48b4bcac1fa9431ca1a8e1ac8889b73f875ce9d9eb15be8\": container with ID starting with 99be6af392b49413d48b4bcac1fa9431ca1a8e1ac8889b73f875ce9d9eb15be8 not found: ID does not exist" containerID="99be6af392b49413d48b4bcac1fa9431ca1a8e1ac8889b73f875ce9d9eb15be8" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.972382 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99be6af392b49413d48b4bcac1fa9431ca1a8e1ac8889b73f875ce9d9eb15be8"} err="failed to get container status \"99be6af392b49413d48b4bcac1fa9431ca1a8e1ac8889b73f875ce9d9eb15be8\": rpc error: code = NotFound desc = could not find container \"99be6af392b49413d48b4bcac1fa9431ca1a8e1ac8889b73f875ce9d9eb15be8\": container with ID starting with 99be6af392b49413d48b4bcac1fa9431ca1a8e1ac8889b73f875ce9d9eb15be8 not found: ID does not exist" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.972419 4721 scope.go:117] "RemoveContainer" containerID="dd3627080e2dde3280df7f039683d58fe97f48d5c559afaab34d198530a21aa4" Feb 02 14:29:10 crc kubenswrapper[4721]: E0202 14:29:10.977714 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd3627080e2dde3280df7f039683d58fe97f48d5c559afaab34d198530a21aa4\": container with ID starting with dd3627080e2dde3280df7f039683d58fe97f48d5c559afaab34d198530a21aa4 not found: ID does not exist" containerID="dd3627080e2dde3280df7f039683d58fe97f48d5c559afaab34d198530a21aa4" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.977771 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd3627080e2dde3280df7f039683d58fe97f48d5c559afaab34d198530a21aa4"} err="failed to get container status \"dd3627080e2dde3280df7f039683d58fe97f48d5c559afaab34d198530a21aa4\": rpc error: code = NotFound desc = could not find container \"dd3627080e2dde3280df7f039683d58fe97f48d5c559afaab34d198530a21aa4\": container with ID starting with dd3627080e2dde3280df7f039683d58fe97f48d5c559afaab34d198530a21aa4 not found: ID does not exist" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.977809 4721 scope.go:117] "RemoveContainer" containerID="45f2eea2c429dbc41d6c0c9c84d8c1d4337067337a0e5187b2ddf2ea0476a477" Feb 02 14:29:10 crc kubenswrapper[4721]: E0202 14:29:10.982649 4721 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"45f2eea2c429dbc41d6c0c9c84d8c1d4337067337a0e5187b2ddf2ea0476a477\": container with ID starting with 45f2eea2c429dbc41d6c0c9c84d8c1d4337067337a0e5187b2ddf2ea0476a477 not found: ID does not exist" containerID="45f2eea2c429dbc41d6c0c9c84d8c1d4337067337a0e5187b2ddf2ea0476a477" Feb 02 14:29:10 crc kubenswrapper[4721]: I0202 14:29:10.982718 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f2eea2c429dbc41d6c0c9c84d8c1d4337067337a0e5187b2ddf2ea0476a477"} err="failed to get container status \"45f2eea2c429dbc41d6c0c9c84d8c1d4337067337a0e5187b2ddf2ea0476a477\": rpc error: code = NotFound desc = could not find container \"45f2eea2c429dbc41d6c0c9c84d8c1d4337067337a0e5187b2ddf2ea0476a477\": container with ID starting with 45f2eea2c429dbc41d6c0c9c84d8c1d4337067337a0e5187b2ddf2ea0476a477 not found: ID does not exist" Feb 02 14:29:11 crc kubenswrapper[4721]: I0202 14:29:11.512217 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:29:11 crc kubenswrapper[4721]: I0202 14:29:11.512395 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:29:12 crc kubenswrapper[4721]: I0202 14:29:12.432217 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2305307-fc37-4522-abb5-6dc428e94e61" path="/var/lib/kubelet/pods/b2305307-fc37-4522-abb5-6dc428e94e61/volumes" Feb 02 14:29:12 crc kubenswrapper[4721]: I0202 14:29:12.563244 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gqkjm" podUID="ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" containerName="registry-server" probeResult="failure" output=< Feb 02 14:29:12 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 14:29:12 crc kubenswrapper[4721]: > Feb 02 14:29:14 crc kubenswrapper[4721]: I0202 14:29:14.763694 4721 patch_prober.go:28] interesting pod/machine-config-daemon-rppjz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 14:29:14 crc kubenswrapper[4721]: I0202 14:29:14.764114 4721 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 14:29:14 crc kubenswrapper[4721]: I0202 14:29:14.764167 4721 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" Feb 02 14:29:14 crc kubenswrapper[4721]: I0202 14:29:14.765228 4721 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a"} pod="openshift-machine-config-operator/machine-config-daemon-rppjz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 14:29:14 crc kubenswrapper[4721]: I0202 14:29:14.765291 4721 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerName="machine-config-daemon" containerID="cri-o://12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" gracePeriod=600 Feb 02 14:29:14 crc kubenswrapper[4721]: E0202 14:29:14.896822 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:29:15 crc kubenswrapper[4721]: I0202 14:29:15.350686 4721 scope.go:117] "RemoveContainer" containerID="3d42ccd527c1c49d2ecc79f70ee1abbc38a002f9404fa0303071492300851127" Feb 02 14:29:15 crc kubenswrapper[4721]: I0202 14:29:15.914738 4721 generic.go:334] "Generic (PLEG): container finished" podID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" exitCode=0 Feb 02 14:29:15 crc kubenswrapper[4721]: I0202 14:29:15.914810 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerDied","Data":"12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a"} Feb 02 14:29:15 crc kubenswrapper[4721]: I0202 14:29:15.915138 4721 scope.go:117] "RemoveContainer" containerID="80a4d97025e59da4be229a82e47246c7035ccedac52df0fa5679fdc8c1c7b8ec" Feb 02 14:29:15 crc kubenswrapper[4721]: I0202 14:29:15.915994 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:29:15 crc kubenswrapper[4721]: E0202 14:29:15.916503 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:29:22 crc kubenswrapper[4721]: I0202 14:29:22.565496 4721 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gqkjm" podUID="ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" containerName="registry-server" probeResult="failure" output=< Feb 02 14:29:22 crc kubenswrapper[4721]: timeout: failed to connect service ":50051" within 1s Feb 02 14:29:22 crc kubenswrapper[4721]: > Feb 02 14:29:29 crc kubenswrapper[4721]: I0202 14:29:29.410120 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:29:29 crc kubenswrapper[4721]: E0202 14:29:29.410780 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:29:31 crc kubenswrapper[4721]: I0202 14:29:31.571568 4721 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:29:31 crc kubenswrapper[4721]: I0202 14:29:31.631160 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:29:31 crc kubenswrapper[4721]: I0202 14:29:31.819217 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqkjm"] Feb 02 14:29:33 crc kubenswrapper[4721]: I0202 14:29:33.102556 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gqkjm" podUID="ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" containerName="registry-server" containerID="cri-o://2c5f9e0314e57b27f735487f6e47e6d06478739d292a9ee40102436d47095166" gracePeriod=2 Feb 02 14:29:33 crc kubenswrapper[4721]: I0202 14:29:33.669905 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:29:33 crc kubenswrapper[4721]: I0202 14:29:33.774130 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdrfd\" (UniqueName: \"kubernetes.io/projected/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-kube-api-access-mdrfd\") pod \"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6\" (UID: \"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6\") " Feb 02 14:29:33 crc kubenswrapper[4721]: I0202 14:29:33.774197 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-catalog-content\") pod \"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6\" (UID: \"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6\") " Feb 02 14:29:33 crc kubenswrapper[4721]: I0202 14:29:33.774400 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-utilities\") pod \"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6\" (UID: \"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6\") " Feb 02 14:29:33 crc kubenswrapper[4721]: I0202 14:29:33.776449 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-utilities" (OuterVolumeSpecName: "utilities") pod "ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" (UID: "ef3b13ac-f3ee-4b4b-a8eb-365a926869e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:29:33 crc kubenswrapper[4721]: I0202 14:29:33.781326 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-kube-api-access-mdrfd" (OuterVolumeSpecName: "kube-api-access-mdrfd") pod "ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" (UID: "ef3b13ac-f3ee-4b4b-a8eb-365a926869e6"). InnerVolumeSpecName "kube-api-access-mdrfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 14:29:33 crc kubenswrapper[4721]: I0202 14:29:33.876849 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 14:29:33 crc kubenswrapper[4721]: I0202 14:29:33.877189 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdrfd\" (UniqueName: \"kubernetes.io/projected/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-kube-api-access-mdrfd\") on node \"crc\" DevicePath \"\"" Feb 02 14:29:33 crc kubenswrapper[4721]: I0202 14:29:33.898089 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" (UID: "ef3b13ac-f3ee-4b4b-a8eb-365a926869e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:29:33 crc kubenswrapper[4721]: I0202 14:29:33.978995 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.130776 4721 generic.go:334] "Generic (PLEG): container finished" podID="ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" containerID="2c5f9e0314e57b27f735487f6e47e6d06478739d292a9ee40102436d47095166" exitCode=0 Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.130824 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqkjm" event={"ID":"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6","Type":"ContainerDied","Data":"2c5f9e0314e57b27f735487f6e47e6d06478739d292a9ee40102436d47095166"} Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.130856 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqkjm" event={"ID":"ef3b13ac-f3ee-4b4b-a8eb-365a926869e6","Type":"ContainerDied","Data":"15ed5f7bb533c11c83fb5d5905a7a20f12acc04f97e988e956205502593356c9"} Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.130877 4721 scope.go:117] "RemoveContainer" containerID="2c5f9e0314e57b27f735487f6e47e6d06478739d292a9ee40102436d47095166" Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.131061 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gqkjm" Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.159229 4721 scope.go:117] "RemoveContainer" containerID="bb504c21a6cea0a7496dbfbf4e6c97a51631619cc916f155c9b3788feb3c7df9" Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.178458 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqkjm"] Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.189030 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gqkjm"] Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.190780 4721 scope.go:117] "RemoveContainer" containerID="f1979fe9a92ae0c671cb466e891c9042ba98ed5b8fb336d632a9e9d82d0813c1" Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.258662 4721 scope.go:117] "RemoveContainer" containerID="2c5f9e0314e57b27f735487f6e47e6d06478739d292a9ee40102436d47095166" Feb 02 14:29:34 crc kubenswrapper[4721]: E0202 14:29:34.259176 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c5f9e0314e57b27f735487f6e47e6d06478739d292a9ee40102436d47095166\": container with ID starting with 2c5f9e0314e57b27f735487f6e47e6d06478739d292a9ee40102436d47095166 not found: ID does not exist" containerID="2c5f9e0314e57b27f735487f6e47e6d06478739d292a9ee40102436d47095166" Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.259216 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c5f9e0314e57b27f735487f6e47e6d06478739d292a9ee40102436d47095166"} err="failed to get container status \"2c5f9e0314e57b27f735487f6e47e6d06478739d292a9ee40102436d47095166\": rpc error: code = NotFound desc = could not find container \"2c5f9e0314e57b27f735487f6e47e6d06478739d292a9ee40102436d47095166\": container with ID starting with 2c5f9e0314e57b27f735487f6e47e6d06478739d292a9ee40102436d47095166 not found: ID does not exist" Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.259249 4721 scope.go:117] "RemoveContainer" containerID="bb504c21a6cea0a7496dbfbf4e6c97a51631619cc916f155c9b3788feb3c7df9" Feb 02 14:29:34 crc kubenswrapper[4721]: E0202 14:29:34.259587 4721 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb504c21a6cea0a7496dbfbf4e6c97a51631619cc916f155c9b3788feb3c7df9\": container with ID starting with bb504c21a6cea0a7496dbfbf4e6c97a51631619cc916f155c9b3788feb3c7df9 not found: ID does not exist" containerID="bb504c21a6cea0a7496dbfbf4e6c97a51631619cc916f155c9b3788feb3c7df9" Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.259634 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb504c21a6cea0a7496dbfbf4e6c97a51631619cc916f155c9b3788feb3c7df9"} err="failed to get container status \"bb504c21a6cea0a7496dbfbf4e6c97a51631619cc916f155c9b3788feb3c7df9\": rpc error: code = NotFound desc = could not find container \"bb504c21a6cea0a7496dbfbf4e6c97a51631619cc916f155c9b3788feb3c7df9\": container with ID starting with bb504c21a6cea0a7496dbfbf4e6c97a51631619cc916f155c9b3788feb3c7df9 not found: ID does not exist" Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.259662 4721 scope.go:117] "RemoveContainer" containerID="f1979fe9a92ae0c671cb466e891c9042ba98ed5b8fb336d632a9e9d82d0813c1" Feb 02 14:29:34 crc kubenswrapper[4721]: E0202 14:29:34.259887 4721 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"f1979fe9a92ae0c671cb466e891c9042ba98ed5b8fb336d632a9e9d82d0813c1\": container with ID starting with f1979fe9a92ae0c671cb466e891c9042ba98ed5b8fb336d632a9e9d82d0813c1 not found: ID does not exist" containerID="f1979fe9a92ae0c671cb466e891c9042ba98ed5b8fb336d632a9e9d82d0813c1" Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.259908 4721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1979fe9a92ae0c671cb466e891c9042ba98ed5b8fb336d632a9e9d82d0813c1"} err="failed to get container status \"f1979fe9a92ae0c671cb466e891c9042ba98ed5b8fb336d632a9e9d82d0813c1\": rpc error: code = NotFound desc = could not find container \"f1979fe9a92ae0c671cb466e891c9042ba98ed5b8fb336d632a9e9d82d0813c1\": container with ID starting with f1979fe9a92ae0c671cb466e891c9042ba98ed5b8fb336d632a9e9d82d0813c1 not found: ID does not exist" Feb 02 14:29:34 crc kubenswrapper[4721]: I0202 14:29:34.428013 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" path="/var/lib/kubelet/pods/ef3b13ac-f3ee-4b4b-a8eb-365a926869e6/volumes" Feb 02 14:29:40 crc kubenswrapper[4721]: I0202 14:29:40.417104 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:29:40 crc kubenswrapper[4721]: E0202 14:29:40.417983 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:29:49 crc kubenswrapper[4721]: I0202 14:29:49.332562 4721 generic.go:334] "Generic (PLEG): container finished" podID="56b96222-739f-41d2-996e-14e2ee91a139" containerID="ee046980151a1ce898253bd863e95503c1103f7eed2587fbfa57c95be0495a2a" exitCode=0 Feb 02 14:29:49 crc kubenswrapper[4721]: I0202 14:29:49.332638 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2dxv/must-gather-kr4jl" event={"ID":"56b96222-739f-41d2-996e-14e2ee91a139","Type":"ContainerDied","Data":"ee046980151a1ce898253bd863e95503c1103f7eed2587fbfa57c95be0495a2a"} Feb 02 14:29:49 crc kubenswrapper[4721]: I0202 14:29:49.333989 4721 scope.go:117] "RemoveContainer" containerID="ee046980151a1ce898253bd863e95503c1103f7eed2587fbfa57c95be0495a2a" Feb 02 14:29:50 crc kubenswrapper[4721]: I0202 14:29:50.241606 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d2dxv_must-gather-kr4jl_56b96222-739f-41d2-996e-14e2ee91a139/gather/0.log" Feb 02 14:29:54 crc kubenswrapper[4721]: I0202 14:29:54.409986 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:29:54 crc kubenswrapper[4721]: E0202 14:29:54.410767 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:29:58 crc kubenswrapper[4721]: I0202 
14:29:58.981825 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d2dxv/must-gather-kr4jl"] Feb 02 14:29:58 crc kubenswrapper[4721]: I0202 14:29:58.982705 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-d2dxv/must-gather-kr4jl" podUID="56b96222-739f-41d2-996e-14e2ee91a139" containerName="copy" containerID="cri-o://79dee8f469c998ce3c1779dff4af47dcba83618a5b09996aa6983b70d3d26f3f" gracePeriod=2 Feb 02 14:29:58 crc kubenswrapper[4721]: I0202 14:29:58.991918 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d2dxv/must-gather-kr4jl"] Feb 02 14:29:59 crc kubenswrapper[4721]: I0202 14:29:59.449798 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d2dxv_must-gather-kr4jl_56b96222-739f-41d2-996e-14e2ee91a139/copy/0.log" Feb 02 14:29:59 crc kubenswrapper[4721]: I0202 14:29:59.451573 4721 generic.go:334] "Generic (PLEG): container finished" podID="56b96222-739f-41d2-996e-14e2ee91a139" containerID="79dee8f469c998ce3c1779dff4af47dcba83618a5b09996aa6983b70d3d26f3f" exitCode=143 Feb 02 14:29:59 crc kubenswrapper[4721]: I0202 14:29:59.912537 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d2dxv_must-gather-kr4jl_56b96222-739f-41d2-996e-14e2ee91a139/copy/0.log" Feb 02 14:29:59 crc kubenswrapper[4721]: I0202 14:29:59.913403 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2dxv/must-gather-kr4jl" Feb 02 14:29:59 crc kubenswrapper[4721]: I0202 14:29:59.931411 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/56b96222-739f-41d2-996e-14e2ee91a139-must-gather-output\") pod \"56b96222-739f-41d2-996e-14e2ee91a139\" (UID: \"56b96222-739f-41d2-996e-14e2ee91a139\") " Feb 02 14:29:59 crc kubenswrapper[4721]: I0202 14:29:59.931707 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v576\" (UniqueName: \"kubernetes.io/projected/56b96222-739f-41d2-996e-14e2ee91a139-kube-api-access-9v576\") pod \"56b96222-739f-41d2-996e-14e2ee91a139\" (UID: \"56b96222-739f-41d2-996e-14e2ee91a139\") " Feb 02 14:29:59 crc kubenswrapper[4721]: I0202 14:29:59.939410 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56b96222-739f-41d2-996e-14e2ee91a139-kube-api-access-9v576" (OuterVolumeSpecName: "kube-api-access-9v576") pod "56b96222-739f-41d2-996e-14e2ee91a139" (UID: "56b96222-739f-41d2-996e-14e2ee91a139"). InnerVolumeSpecName "kube-api-access-9v576". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.033565 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v576\" (UniqueName: \"kubernetes.io/projected/56b96222-739f-41d2-996e-14e2ee91a139-kube-api-access-9v576\") on node \"crc\" DevicePath \"\"" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.087420 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56b96222-739f-41d2-996e-14e2ee91a139-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "56b96222-739f-41d2-996e-14e2ee91a139" (UID: "56b96222-739f-41d2-996e-14e2ee91a139"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.136026 4721 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/56b96222-739f-41d2-996e-14e2ee91a139-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.163812 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh"] Feb 02 14:30:00 crc kubenswrapper[4721]: E0202 14:30:00.164740 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" containerName="extract-content" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.164763 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" containerName="extract-content" Feb 02 14:30:00 crc kubenswrapper[4721]: E0202 14:30:00.164784 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2305307-fc37-4522-abb5-6dc428e94e61" containerName="extract-content" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.164794 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2305307-fc37-4522-abb5-6dc428e94e61" containerName="extract-content" Feb 02 14:30:00 crc kubenswrapper[4721]: E0202 14:30:00.164826 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" containerName="registry-server" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.164834 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" containerName="registry-server" Feb 02 14:30:00 crc kubenswrapper[4721]: E0202 14:30:00.164850 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa73e29e-aeec-4257-abab-cc99e8e99afa" containerName="registry-server" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.164857 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa73e29e-aeec-4257-abab-cc99e8e99afa" containerName="registry-server" Feb 02 14:30:00 crc kubenswrapper[4721]: E0202 14:30:00.164866 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2305307-fc37-4522-abb5-6dc428e94e61" containerName="extract-utilities" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.164873 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2305307-fc37-4522-abb5-6dc428e94e61" containerName="extract-utilities" Feb 02 14:30:00 crc kubenswrapper[4721]: E0202 14:30:00.164891 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56b96222-739f-41d2-996e-14e2ee91a139" containerName="gather" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.164899 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b96222-739f-41d2-996e-14e2ee91a139" containerName="gather" Feb 02 14:30:00 crc kubenswrapper[4721]: E0202 14:30:00.164910 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" containerName="extract-utilities" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.164918 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" containerName="extract-utilities" Feb 02 14:30:00 crc kubenswrapper[4721]: E0202 14:30:00.164936 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa73e29e-aeec-4257-abab-cc99e8e99afa" containerName="extract-utilities" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 
14:30:00.164944 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa73e29e-aeec-4257-abab-cc99e8e99afa" containerName="extract-utilities" Feb 02 14:30:00 crc kubenswrapper[4721]: E0202 14:30:00.164957 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa73e29e-aeec-4257-abab-cc99e8e99afa" containerName="extract-content" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.164963 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa73e29e-aeec-4257-abab-cc99e8e99afa" containerName="extract-content" Feb 02 14:30:00 crc kubenswrapper[4721]: E0202 14:30:00.164972 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2305307-fc37-4522-abb5-6dc428e94e61" containerName="registry-server" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.164979 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2305307-fc37-4522-abb5-6dc428e94e61" containerName="registry-server" Feb 02 14:30:00 crc kubenswrapper[4721]: E0202 14:30:00.164994 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56b96222-739f-41d2-996e-14e2ee91a139" containerName="copy" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.165001 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b96222-739f-41d2-996e-14e2ee91a139" containerName="copy" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.165304 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2305307-fc37-4522-abb5-6dc428e94e61" containerName="registry-server" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.165330 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="56b96222-739f-41d2-996e-14e2ee91a139" containerName="copy" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.165355 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="56b96222-739f-41d2-996e-14e2ee91a139" containerName="gather" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.165366 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa73e29e-aeec-4257-abab-cc99e8e99afa" containerName="registry-server" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.165381 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef3b13ac-f3ee-4b4b-a8eb-365a926869e6" containerName="registry-server" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.166445 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.169759 4721 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.171429 4721 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.179016 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh"] Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.240480 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxl4k\" (UniqueName: \"kubernetes.io/projected/e74898b2-80a7-41b4-b44d-e6fcf2409d49-kube-api-access-zxl4k\") pod \"collect-profiles-29500710-st8kh\" (UID: \"e74898b2-80a7-41b4-b44d-e6fcf2409d49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.240697 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e74898b2-80a7-41b4-b44d-e6fcf2409d49-secret-volume\") pod \"collect-profiles-29500710-st8kh\" (UID: \"e74898b2-80a7-41b4-b44d-e6fcf2409d49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.241055 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e74898b2-80a7-41b4-b44d-e6fcf2409d49-config-volume\") pod \"collect-profiles-29500710-st8kh\" (UID: \"e74898b2-80a7-41b4-b44d-e6fcf2409d49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.343446 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e74898b2-80a7-41b4-b44d-e6fcf2409d49-config-volume\") pod \"collect-profiles-29500710-st8kh\" (UID: \"e74898b2-80a7-41b4-b44d-e6fcf2409d49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.343546 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxl4k\" (UniqueName: \"kubernetes.io/projected/e74898b2-80a7-41b4-b44d-e6fcf2409d49-kube-api-access-zxl4k\") pod \"collect-profiles-29500710-st8kh\" (UID: \"e74898b2-80a7-41b4-b44d-e6fcf2409d49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.343682 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e74898b2-80a7-41b4-b44d-e6fcf2409d49-secret-volume\") pod \"collect-profiles-29500710-st8kh\" (UID: \"e74898b2-80a7-41b4-b44d-e6fcf2409d49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.344497 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e74898b2-80a7-41b4-b44d-e6fcf2409d49-config-volume\") pod 
\"collect-profiles-29500710-st8kh\" (UID: \"e74898b2-80a7-41b4-b44d-e6fcf2409d49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.347930 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e74898b2-80a7-41b4-b44d-e6fcf2409d49-secret-volume\") pod \"collect-profiles-29500710-st8kh\" (UID: \"e74898b2-80a7-41b4-b44d-e6fcf2409d49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.363019 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxl4k\" (UniqueName: \"kubernetes.io/projected/e74898b2-80a7-41b4-b44d-e6fcf2409d49-kube-api-access-zxl4k\") pod \"collect-profiles-29500710-st8kh\" (UID: \"e74898b2-80a7-41b4-b44d-e6fcf2409d49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.432739 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56b96222-739f-41d2-996e-14e2ee91a139" path="/var/lib/kubelet/pods/56b96222-739f-41d2-996e-14e2ee91a139/volumes" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.478750 4721 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d2dxv_must-gather-kr4jl_56b96222-739f-41d2-996e-14e2ee91a139/copy/0.log" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.479145 4721 scope.go:117] "RemoveContainer" containerID="79dee8f469c998ce3c1779dff4af47dcba83618a5b09996aa6983b70d3d26f3f" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.479266 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2dxv/must-gather-kr4jl" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.486690 4721 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" Feb 02 14:30:00 crc kubenswrapper[4721]: I0202 14:30:00.968856 4721 scope.go:117] "RemoveContainer" containerID="ee046980151a1ce898253bd863e95503c1103f7eed2587fbfa57c95be0495a2a" Feb 02 14:30:01 crc kubenswrapper[4721]: I0202 14:30:01.532710 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh"] Feb 02 14:30:02 crc kubenswrapper[4721]: I0202 14:30:02.506455 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" event={"ID":"e74898b2-80a7-41b4-b44d-e6fcf2409d49","Type":"ContainerStarted","Data":"14e5b815570fe82ab155ac70919cc247b7c3fef1e80c74818febeab15ae1519b"} Feb 02 14:30:02 crc kubenswrapper[4721]: I0202 14:30:02.507154 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" event={"ID":"e74898b2-80a7-41b4-b44d-e6fcf2409d49","Type":"ContainerStarted","Data":"4e45108cd7d7e75c5966932e75398503072a8a03e8e81731d9289ce2c1b8cdd3"} Feb 02 14:30:02 crc kubenswrapper[4721]: I0202 14:30:02.529703 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" podStartSLOduration=2.529678958 podStartE2EDuration="2.529678958s" podCreationTimestamp="2026-02-02 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 14:30:02.520914411 +0000 UTC m=+5342.823428820" watchObservedRunningTime="2026-02-02 14:30:02.529678958 +0000 UTC m=+5342.832193357" Feb 02 14:30:03 crc kubenswrapper[4721]: I0202 14:30:03.517657 4721 generic.go:334] "Generic (PLEG): container finished" podID="e74898b2-80a7-41b4-b44d-e6fcf2409d49" containerID="14e5b815570fe82ab155ac70919cc247b7c3fef1e80c74818febeab15ae1519b" exitCode=0 Feb 02 14:30:03 crc kubenswrapper[4721]: I0202 14:30:03.517755 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" event={"ID":"e74898b2-80a7-41b4-b44d-e6fcf2409d49","Type":"ContainerDied","Data":"14e5b815570fe82ab155ac70919cc247b7c3fef1e80c74818febeab15ae1519b"} Feb 02 14:30:04 crc kubenswrapper[4721]: I0202 14:30:04.947192 4721 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" Feb 02 14:30:05 crc kubenswrapper[4721]: I0202 14:30:05.004925 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxl4k\" (UniqueName: \"kubernetes.io/projected/e74898b2-80a7-41b4-b44d-e6fcf2409d49-kube-api-access-zxl4k\") pod \"e74898b2-80a7-41b4-b44d-e6fcf2409d49\" (UID: \"e74898b2-80a7-41b4-b44d-e6fcf2409d49\") " Feb 02 14:30:05 crc kubenswrapper[4721]: I0202 14:30:05.004980 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e74898b2-80a7-41b4-b44d-e6fcf2409d49-config-volume\") pod \"e74898b2-80a7-41b4-b44d-e6fcf2409d49\" (UID: \"e74898b2-80a7-41b4-b44d-e6fcf2409d49\") " Feb 02 14:30:05 crc kubenswrapper[4721]: I0202 14:30:05.005111 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e74898b2-80a7-41b4-b44d-e6fcf2409d49-secret-volume\") pod \"e74898b2-80a7-41b4-b44d-e6fcf2409d49\" (UID: \"e74898b2-80a7-41b4-b44d-e6fcf2409d49\") " Feb 02 14:30:05 crc kubenswrapper[4721]: I0202 14:30:05.006108 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e74898b2-80a7-41b4-b44d-e6fcf2409d49-config-volume" (OuterVolumeSpecName: "config-volume") pod "e74898b2-80a7-41b4-b44d-e6fcf2409d49" (UID: "e74898b2-80a7-41b4-b44d-e6fcf2409d49"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 14:30:05 crc kubenswrapper[4721]: I0202 14:30:05.033760 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e74898b2-80a7-41b4-b44d-e6fcf2409d49-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e74898b2-80a7-41b4-b44d-e6fcf2409d49" (UID: "e74898b2-80a7-41b4-b44d-e6fcf2409d49"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 14:30:05 crc kubenswrapper[4721]: I0202 14:30:05.034154 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e74898b2-80a7-41b4-b44d-e6fcf2409d49-kube-api-access-zxl4k" (OuterVolumeSpecName: "kube-api-access-zxl4k") pod "e74898b2-80a7-41b4-b44d-e6fcf2409d49" (UID: "e74898b2-80a7-41b4-b44d-e6fcf2409d49"). InnerVolumeSpecName "kube-api-access-zxl4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 14:30:05 crc kubenswrapper[4721]: I0202 14:30:05.108895 4721 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e74898b2-80a7-41b4-b44d-e6fcf2409d49-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 14:30:05 crc kubenswrapper[4721]: I0202 14:30:05.108932 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxl4k\" (UniqueName: \"kubernetes.io/projected/e74898b2-80a7-41b4-b44d-e6fcf2409d49-kube-api-access-zxl4k\") on node \"crc\" DevicePath \"\"" Feb 02 14:30:05 crc kubenswrapper[4721]: I0202 14:30:05.108945 4721 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e74898b2-80a7-41b4-b44d-e6fcf2409d49-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 14:30:05 crc kubenswrapper[4721]: I0202 14:30:05.536214 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" event={"ID":"e74898b2-80a7-41b4-b44d-e6fcf2409d49","Type":"ContainerDied","Data":"4e45108cd7d7e75c5966932e75398503072a8a03e8e81731d9289ce2c1b8cdd3"} Feb 02 14:30:05 crc kubenswrapper[4721]: I0202 14:30:05.536805 4721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e45108cd7d7e75c5966932e75398503072a8a03e8e81731d9289ce2c1b8cdd3" Feb 02 14:30:05 crc kubenswrapper[4721]: I0202 14:30:05.536250 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500710-st8kh" Feb 02 14:30:05 crc kubenswrapper[4721]: I0202 14:30:05.593636 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh"] Feb 02 14:30:05 crc kubenswrapper[4721]: I0202 14:30:05.604946 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500665-gnnxh"] Feb 02 14:30:06 crc kubenswrapper[4721]: I0202 14:30:06.423297 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c1f0398-e18b-44f0-b0a8-21f2de8af4d0" path="/var/lib/kubelet/pods/0c1f0398-e18b-44f0-b0a8-21f2de8af4d0/volumes" Feb 02 14:30:07 crc kubenswrapper[4721]: I0202 14:30:07.416664 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:30:07 crc kubenswrapper[4721]: E0202 14:30:07.417264 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:30:15 crc kubenswrapper[4721]: I0202 14:30:15.840374 4721 scope.go:117] "RemoveContainer" containerID="65b28cee5d7aba79bd5fdc801054328d6006f1cbcc896f6fdd692cc1c3bf2690" Feb 02 14:30:19 crc kubenswrapper[4721]: I0202 14:30:19.410044 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:30:19 crc kubenswrapper[4721]: E0202 14:30:19.410853 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:30:33 crc kubenswrapper[4721]: I0202 14:30:33.132774 4721 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qkd8k"] Feb 02 14:30:33 crc kubenswrapper[4721]: E0202 14:30:33.134311 4721 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74898b2-80a7-41b4-b44d-e6fcf2409d49" containerName="collect-profiles" Feb 02 14:30:33 crc kubenswrapper[4721]: I0202 14:30:33.134326 4721 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74898b2-80a7-41b4-b44d-e6fcf2409d49" containerName="collect-profiles" Feb 02 14:30:33 crc kubenswrapper[4721]: I0202 14:30:33.134580 4721 memory_manager.go:354] "RemoveStaleState removing state" podUID="e74898b2-80a7-41b4-b44d-e6fcf2409d49" containerName="collect-profiles" Feb 02 14:30:33 crc kubenswrapper[4721]: I0202 14:30:33.136713 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:33 crc kubenswrapper[4721]: I0202 14:30:33.148931 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkd8k"] Feb 02 14:30:33 crc kubenswrapper[4721]: I0202 14:30:33.205775 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3688a7b5-c948-4eab-8be8-2206c13a2af4-catalog-content\") pod \"redhat-marketplace-qkd8k\" (UID: \"3688a7b5-c948-4eab-8be8-2206c13a2af4\") " pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:33 crc kubenswrapper[4721]: I0202 14:30:33.205847 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svlz6\" (UniqueName: \"kubernetes.io/projected/3688a7b5-c948-4eab-8be8-2206c13a2af4-kube-api-access-svlz6\") pod \"redhat-marketplace-qkd8k\" (UID: \"3688a7b5-c948-4eab-8be8-2206c13a2af4\") " pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:33 crc kubenswrapper[4721]: I0202 14:30:33.205966 4721 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3688a7b5-c948-4eab-8be8-2206c13a2af4-utilities\") pod \"redhat-marketplace-qkd8k\" (UID: \"3688a7b5-c948-4eab-8be8-2206c13a2af4\") " pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:33 crc kubenswrapper[4721]: I0202 14:30:33.319128 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3688a7b5-c948-4eab-8be8-2206c13a2af4-catalog-content\") pod \"redhat-marketplace-qkd8k\" (UID: \"3688a7b5-c948-4eab-8be8-2206c13a2af4\") " pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:33 crc kubenswrapper[4721]: I0202 14:30:33.319473 4721 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svlz6\" (UniqueName: \"kubernetes.io/projected/3688a7b5-c948-4eab-8be8-2206c13a2af4-kube-api-access-svlz6\") pod \"redhat-marketplace-qkd8k\" (UID: \"3688a7b5-c948-4eab-8be8-2206c13a2af4\") " pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:33 crc kubenswrapper[4721]: I0202 14:30:33.319696 4721 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3688a7b5-c948-4eab-8be8-2206c13a2af4-utilities\") pod \"redhat-marketplace-qkd8k\" (UID: \"3688a7b5-c948-4eab-8be8-2206c13a2af4\") " pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:33 crc kubenswrapper[4721]: I0202 14:30:33.320653 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3688a7b5-c948-4eab-8be8-2206c13a2af4-utilities\") pod \"redhat-marketplace-qkd8k\" (UID: \"3688a7b5-c948-4eab-8be8-2206c13a2af4\") " pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:33 crc kubenswrapper[4721]: I0202 14:30:33.321549 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3688a7b5-c948-4eab-8be8-2206c13a2af4-catalog-content\") pod \"redhat-marketplace-qkd8k\" (UID: \"3688a7b5-c948-4eab-8be8-2206c13a2af4\") " pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:33 crc kubenswrapper[4721]: I0202 14:30:33.469716 4721 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svlz6\" (UniqueName: \"kubernetes.io/projected/3688a7b5-c948-4eab-8be8-2206c13a2af4-kube-api-access-svlz6\") pod \"redhat-marketplace-qkd8k\" (UID: \"3688a7b5-c948-4eab-8be8-2206c13a2af4\") " pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:33 crc kubenswrapper[4721]: I0202 14:30:33.764305 4721 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:34 crc kubenswrapper[4721]: I0202 14:30:34.261300 4721 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkd8k"] Feb 02 14:30:34 crc kubenswrapper[4721]: I0202 14:30:34.409244 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:30:34 crc kubenswrapper[4721]: E0202 14:30:34.409533 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:30:34 crc kubenswrapper[4721]: I0202 14:30:34.844084 4721 generic.go:334] "Generic (PLEG): container finished" podID="3688a7b5-c948-4eab-8be8-2206c13a2af4" containerID="bdd9f70acfc914613f4ccaf5ae253dff448e5366e8d44d9c8f70f0811a2239c0" exitCode=0 Feb 02 14:30:34 crc kubenswrapper[4721]: I0202 14:30:34.844202 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkd8k" event={"ID":"3688a7b5-c948-4eab-8be8-2206c13a2af4","Type":"ContainerDied","Data":"bdd9f70acfc914613f4ccaf5ae253dff448e5366e8d44d9c8f70f0811a2239c0"} Feb 02 14:30:34 crc kubenswrapper[4721]: I0202 14:30:34.844500 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkd8k" event={"ID":"3688a7b5-c948-4eab-8be8-2206c13a2af4","Type":"ContainerStarted","Data":"c375a82bdfcb0486ffd2febfb8cf261cb67c6154240c2840964db47a65af8885"} Feb 02 14:30:36 crc kubenswrapper[4721]: I0202 14:30:36.870060 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-qkd8k" event={"ID":"3688a7b5-c948-4eab-8be8-2206c13a2af4","Type":"ContainerStarted","Data":"f84d65de81662ae82217c9ed34d813c8add0279b72245d302e8ae2471cca8b71"} Feb 02 14:30:37 crc kubenswrapper[4721]: I0202 14:30:37.884968 4721 generic.go:334] "Generic (PLEG): container finished" podID="3688a7b5-c948-4eab-8be8-2206c13a2af4" containerID="f84d65de81662ae82217c9ed34d813c8add0279b72245d302e8ae2471cca8b71" exitCode=0 Feb 02 14:30:37 crc kubenswrapper[4721]: I0202 14:30:37.885095 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkd8k" event={"ID":"3688a7b5-c948-4eab-8be8-2206c13a2af4","Type":"ContainerDied","Data":"f84d65de81662ae82217c9ed34d813c8add0279b72245d302e8ae2471cca8b71"} Feb 02 14:30:38 crc kubenswrapper[4721]: I0202 14:30:38.898145 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkd8k" event={"ID":"3688a7b5-c948-4eab-8be8-2206c13a2af4","Type":"ContainerStarted","Data":"e4f07d6b243c486bb07b9ba79db513a67c0f430b043761f4c17f0a1c063e001d"} Feb 02 14:30:38 crc kubenswrapper[4721]: I0202 14:30:38.928685 4721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qkd8k" podStartSLOduration=2.475175078 podStartE2EDuration="5.928659032s" podCreationTimestamp="2026-02-02 14:30:33 +0000 UTC" firstStartedPulling="2026-02-02 14:30:34.847506863 +0000 UTC m=+5375.150021272" lastFinishedPulling="2026-02-02 14:30:38.300990837 +0000 UTC m=+5378.603505226" observedRunningTime="2026-02-02 14:30:38.919148405 +0000 UTC m=+5379.221662814" watchObservedRunningTime="2026-02-02 14:30:38.928659032 +0000 UTC m=+5379.231173441" Feb 02 14:30:43 crc kubenswrapper[4721]: I0202 14:30:43.764767 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:43 crc kubenswrapper[4721]: I0202 14:30:43.765456 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:43 crc kubenswrapper[4721]: I0202 14:30:43.812712 4721 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:43 crc kubenswrapper[4721]: I0202 14:30:43.992436 4721 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:44 crc kubenswrapper[4721]: I0202 14:30:44.060934 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkd8k"] Feb 02 14:30:45 crc kubenswrapper[4721]: I0202 14:30:45.965346 4721 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qkd8k" podUID="3688a7b5-c948-4eab-8be8-2206c13a2af4" containerName="registry-server" containerID="cri-o://e4f07d6b243c486bb07b9ba79db513a67c0f430b043761f4c17f0a1c063e001d" gracePeriod=2 Feb 02 14:30:47 crc kubenswrapper[4721]: I0202 14:30:47.004228 4721 generic.go:334] "Generic (PLEG): container finished" podID="3688a7b5-c948-4eab-8be8-2206c13a2af4" containerID="e4f07d6b243c486bb07b9ba79db513a67c0f430b043761f4c17f0a1c063e001d" exitCode=0 Feb 02 14:30:47 crc kubenswrapper[4721]: I0202 14:30:47.004328 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkd8k" 
event={"ID":"3688a7b5-c948-4eab-8be8-2206c13a2af4","Type":"ContainerDied","Data":"e4f07d6b243c486bb07b9ba79db513a67c0f430b043761f4c17f0a1c063e001d"} Feb 02 14:30:47 crc kubenswrapper[4721]: I0202 14:30:47.418713 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:47 crc kubenswrapper[4721]: I0202 14:30:47.573478 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3688a7b5-c948-4eab-8be8-2206c13a2af4-utilities\") pod \"3688a7b5-c948-4eab-8be8-2206c13a2af4\" (UID: \"3688a7b5-c948-4eab-8be8-2206c13a2af4\") " Feb 02 14:30:47 crc kubenswrapper[4721]: I0202 14:30:47.573838 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3688a7b5-c948-4eab-8be8-2206c13a2af4-catalog-content\") pod \"3688a7b5-c948-4eab-8be8-2206c13a2af4\" (UID: \"3688a7b5-c948-4eab-8be8-2206c13a2af4\") " Feb 02 14:30:47 crc kubenswrapper[4721]: I0202 14:30:47.574148 4721 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svlz6\" (UniqueName: \"kubernetes.io/projected/3688a7b5-c948-4eab-8be8-2206c13a2af4-kube-api-access-svlz6\") pod \"3688a7b5-c948-4eab-8be8-2206c13a2af4\" (UID: \"3688a7b5-c948-4eab-8be8-2206c13a2af4\") " Feb 02 14:30:47 crc kubenswrapper[4721]: I0202 14:30:47.574726 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3688a7b5-c948-4eab-8be8-2206c13a2af4-utilities" (OuterVolumeSpecName: "utilities") pod "3688a7b5-c948-4eab-8be8-2206c13a2af4" (UID: "3688a7b5-c948-4eab-8be8-2206c13a2af4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:30:47 crc kubenswrapper[4721]: I0202 14:30:47.575371 4721 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3688a7b5-c948-4eab-8be8-2206c13a2af4-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 14:30:47 crc kubenswrapper[4721]: I0202 14:30:47.580628 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3688a7b5-c948-4eab-8be8-2206c13a2af4-kube-api-access-svlz6" (OuterVolumeSpecName: "kube-api-access-svlz6") pod "3688a7b5-c948-4eab-8be8-2206c13a2af4" (UID: "3688a7b5-c948-4eab-8be8-2206c13a2af4"). InnerVolumeSpecName "kube-api-access-svlz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 14:30:47 crc kubenswrapper[4721]: I0202 14:30:47.596053 4721 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3688a7b5-c948-4eab-8be8-2206c13a2af4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3688a7b5-c948-4eab-8be8-2206c13a2af4" (UID: "3688a7b5-c948-4eab-8be8-2206c13a2af4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 14:30:47 crc kubenswrapper[4721]: I0202 14:30:47.676847 4721 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svlz6\" (UniqueName: \"kubernetes.io/projected/3688a7b5-c948-4eab-8be8-2206c13a2af4-kube-api-access-svlz6\") on node \"crc\" DevicePath \"\"" Feb 02 14:30:47 crc kubenswrapper[4721]: I0202 14:30:47.676893 4721 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3688a7b5-c948-4eab-8be8-2206c13a2af4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 14:30:48 crc kubenswrapper[4721]: I0202 14:30:48.015826 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkd8k" event={"ID":"3688a7b5-c948-4eab-8be8-2206c13a2af4","Type":"ContainerDied","Data":"c375a82bdfcb0486ffd2febfb8cf261cb67c6154240c2840964db47a65af8885"} Feb 02 14:30:48 crc kubenswrapper[4721]: I0202 14:30:48.015882 4721 scope.go:117] "RemoveContainer" containerID="e4f07d6b243c486bb07b9ba79db513a67c0f430b043761f4c17f0a1c063e001d" Feb 02 14:30:48 crc kubenswrapper[4721]: I0202 14:30:48.015954 4721 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkd8k" Feb 02 14:30:48 crc kubenswrapper[4721]: I0202 14:30:48.037812 4721 scope.go:117] "RemoveContainer" containerID="f84d65de81662ae82217c9ed34d813c8add0279b72245d302e8ae2471cca8b71" Feb 02 14:30:48 crc kubenswrapper[4721]: I0202 14:30:48.061711 4721 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkd8k"] Feb 02 14:30:48 crc kubenswrapper[4721]: I0202 14:30:48.068227 4721 scope.go:117] "RemoveContainer" containerID="bdd9f70acfc914613f4ccaf5ae253dff448e5366e8d44d9c8f70f0811a2239c0" Feb 02 14:30:48 crc kubenswrapper[4721]: I0202 14:30:48.084172 4721 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkd8k"] Feb 02 14:30:48 crc kubenswrapper[4721]: I0202 14:30:48.410189 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:30:48 crc kubenswrapper[4721]: E0202 14:30:48.410803 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:30:48 crc kubenswrapper[4721]: I0202 14:30:48.425442 4721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3688a7b5-c948-4eab-8be8-2206c13a2af4" path="/var/lib/kubelet/pods/3688a7b5-c948-4eab-8be8-2206c13a2af4/volumes" Feb 02 14:31:03 crc kubenswrapper[4721]: I0202 14:31:03.410223 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:31:03 crc kubenswrapper[4721]: E0202 14:31:03.411284 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" 
podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:31:14 crc kubenswrapper[4721]: I0202 14:31:14.432562 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:31:14 crc kubenswrapper[4721]: E0202 14:31:14.433382 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:31:28 crc kubenswrapper[4721]: I0202 14:31:28.410332 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:31:28 crc kubenswrapper[4721]: E0202 14:31:28.413404 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:31:42 crc kubenswrapper[4721]: I0202 14:31:42.410554 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:31:42 crc kubenswrapper[4721]: E0202 14:31:42.411369 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:31:55 crc kubenswrapper[4721]: I0202 14:31:55.409954 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:31:55 crc kubenswrapper[4721]: E0202 14:31:55.410650 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:32:09 crc kubenswrapper[4721]: I0202 14:32:09.410325 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:32:09 crc kubenswrapper[4721]: E0202 14:32:09.411539 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:32:22 crc kubenswrapper[4721]: I0202 14:32:22.409451 4721 scope.go:117] "RemoveContainer" 
containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:32:22 crc kubenswrapper[4721]: E0202 14:32:22.412114 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:32:34 crc kubenswrapper[4721]: I0202 14:32:34.409621 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:32:34 crc kubenswrapper[4721]: E0202 14:32:34.410419 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:32:47 crc kubenswrapper[4721]: I0202 14:32:47.410922 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:32:47 crc kubenswrapper[4721]: E0202 14:32:47.412391 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:32:59 crc kubenswrapper[4721]: I0202 14:32:59.410517 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:32:59 crc kubenswrapper[4721]: E0202 14:32:59.412387 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:33:12 crc kubenswrapper[4721]: I0202 14:33:12.410028 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:33:12 crc kubenswrapper[4721]: E0202 14:33:12.410843 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:33:25 crc kubenswrapper[4721]: I0202 14:33:25.410053 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:33:25 crc kubenswrapper[4721]: E0202 14:33:25.411015 4721 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:33:38 crc kubenswrapper[4721]: I0202 14:33:38.411595 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:33:38 crc kubenswrapper[4721]: E0202 14:33:38.412318 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:33:50 crc kubenswrapper[4721]: I0202 14:33:50.417246 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:33:50 crc kubenswrapper[4721]: E0202 14:33:50.418057 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:34:05 crc kubenswrapper[4721]: I0202 14:34:05.409419 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:34:05 crc kubenswrapper[4721]: E0202 14:34:05.412862 4721 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-rppjz_openshift-machine-config-operator(bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877)\"" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" podUID="bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877" Feb 02 14:34:20 crc kubenswrapper[4721]: I0202 14:34:20.417745 4721 scope.go:117] "RemoveContainer" containerID="12a34d4692995e23be9650f3fe27555782fc4f8063d028a49b067d6507a1f26a" Feb 02 14:34:21 crc kubenswrapper[4721]: I0202 14:34:21.648057 4721 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-rppjz" event={"ID":"bc8c3bf4-0f02-47a1-b8b8-1e40a8daa877","Type":"ContainerStarted","Data":"ef58b4bdfc65d5d05d4e39085c30bb8dde9fa0348a65ba8a0a6ea9615c8f01dc"} var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515140133110024432 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015140133110017347 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015140117513016503 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015140117513015453 5ustar corecore